
Flux (machine-learning framework)

From Wikipedia, the free encyclopedia

Flux
Original author(s): Michael J Innes,[1] Dhairya Gandhi,[2] and contributors[3]
Stable release: 0.14.5[4] / 7 September 2023
Repository: github.com/FluxML/Flux.jl
Written in: Julia
Type: Machine learning library
License: MIT[5]
Website: https://fluxml.ai

Flux is an open-source machine-learning software library and ecosystem written in Julia.[1][6] Its current stable release is v0.14.5.[4] It has a layer-stacking-based interface for simpler models, and emphasizes interoperability with other Julia packages rather than a monolithic design.[7] For example, GPU support is implemented transparently by CuArrays.jl.[8] This contrasts with machine-learning frameworks that are implemented in other languages with Julia bindings, such as TensorFlow.jl (the unofficial wrapper, now deprecated), and are therefore limited by the functionality present in the underlying implementation, which is often in C or C++.[9] Flux joined NumFOCUS as an affiliated project in December 2021.[10]
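The layer-stacking interface can be illustrated with a minimal sketch using Flux's `Chain` and `Dense` layers (the layer sizes and activation functions here are purely illustrative):

```julia
using Flux

# A small multilayer perceptron built by stacking layers with Chain.
# Dense(in => out, activation) is a fully connected layer.
model = Chain(
    Dense(784 => 32, relu),   # hidden layer with ReLU activation
    Dense(32 => 10),          # output layer (10 classes)
    softmax)                  # normalise outputs to probabilities

x = rand(Float32, 784)        # a dummy input vector
y = model(x)                  # forward pass; y is a length-10 probability vector
```

Because `Chain` simply composes callable layers, any Julia function can be inserted into the stack, which is one way the interoperability-first design shows up in practice.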

Flux's focus on interoperability has enabled, for example, support for Neural Differential Equations, by fusing Flux.jl and DifferentialEquations.jl into DiffEqFlux.jl.[11][12]
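A sketch of how the two packages compose, assuming DiffEqFlux's `NeuralODE` wrapper (the exact API has varied across versions, so this is illustrative rather than definitive):

```julia
using Flux, DifferentialEquations, DiffEqFlux

# A Flux network defines the right-hand side du/dt = f(u) of an ODE.
dudt = Chain(Dense(2 => 16, tanh), Dense(16 => 2))

# NeuralODE wraps the network together with a time span and an ODE solver,
# so the solve itself becomes a differentiable layer.
node = NeuralODE(dudt, (0.0f0, 1.0f0), Tsit5(), saveat = 0.1f0)
```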

Flux supports recurrent and convolutional networks. It is also capable of differentiable programming[13][14][15] through its source-to-source automatic differentiation package, Zygote.jl.[16]
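The differentiable-programming capability can be sketched with Zygote's `gradient` function, which differentiates ordinary Julia code directly:

```julia
using Zygote

# Differentiate a plain Julia function: d/dx (3x^2 + 2x) = 6x + 2.
f(x) = 3x^2 + 2x
df = gradient(f, 5.0)   # returns a tuple with one gradient per argument
# df[1] == 32.0, i.e. 6*5 + 2
```

No special tensor types or tracing are required; Zygote transforms the function's SSA-form intermediate representation at compile time, which is what "source-to-source" refers to above.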

Julia is a popular language for machine learning,[17] and Flux.jl is its most highly regarded machine-learning repository[17] (Lux.jl is a more recent framework that shares much of its code with Flux.jl). A demonstration[18] compiling Julia code to run on Google's tensor processing unit (TPU) received praise from Google Brain AI lead Jeff Dean.[19]

Flux has been used as a framework to build neural networks that operate on homomorphically encrypted data without ever decrypting it.[20][21] This kind of application is envisioned to be central to the privacy of future APIs that use machine-learning models.[22]

Flux.jl is an intermediate representation for running high-level programs on CUDA hardware.[23][24] It was the predecessor to CUDAnative.jl, which is also a GPU programming language.[25]
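In day-to-day use, the GPU support mentioned above is exposed through a simple transfer function. A minimal sketch (assuming a machine with a functional CUDA setup; loading the CUDA.jl package enables the actual transfer, and `gpu` is a no-op fallback on CPU-only machines):

```julia
using Flux

# Build a small model and a dummy input on the CPU.
model = Dense(10 => 2)
x = rand(Float32, 10)

# gpu(...) moves parameters and data to the GPU when one is available;
# otherwise it returns its argument unchanged.
model_gpu = gpu(model)
y = model_gpu(gpu(x))
```

Because array operations dispatch on the array type, the same model code runs on CPU and GPU without modification.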


References

  1. ^ a b Innes, Michael (2018-05-03). "Flux: Elegant machine learning with Julia". Journal of Open Source Software. 3 (25): 602. Bibcode:2018JOSS....3..602I. doi:10.21105/joss.00602.
  2. ^ Dhairya Gandhi, GitHub, 2021-06-27, retrieved 2021-06-27
  3. ^ Flux Contributors, GitHub, 2021-06-27, retrieved 2021-06-27
  4. ^ a b "Flux v0.14.5". 7 September 2023. Retrieved 10 September 2023.
  5. ^ "github.com/FluxML/Flux.jl/blob/master/LICENSE.md". GitHub. 6 November 2021.
  6. ^ Innes, Mike; Bradbury, James; Fischer, Keno; Gandhi, Dhairya; Mariya Joy, Neethu; Karmali, Tejan; Kelley, Matt; Pal, Avik; Concetto Rudilosso, Marco; Saba, Elliot; Shah, Viral; Yuret, Deniz. "Building a Language and Compiler for Machine Learning". julialang.org. Retrieved 2019-06-02.
  7. ^ "Machine Learning and Artificial Intelligence". juliacomputing.com. Archived from the original on 2019-06-02. Retrieved 2019-06-02.
  8. ^ Gandhi, Dhairya (2018-11-15). "Julia at NeurIPS and the Future of Machine Learning Tools". juliacomputing.com. Archived from the original on 2019-06-02. Retrieved 2019-06-02.
  9. ^ Malmaud, Jonathan; White, Lyndon (2018-11-01). "TensorFlow.jl: An Idiomatic Julia Front End for TensorFlow". Journal of Open Source Software. 3 (31): 1002. Bibcode:2018JOSS....3.1002M. doi:10.21105/joss.01002.
  10. ^ "Flux <3 NumFOCUS". fluxml.ai. Archived from the original on 2021-12-01. Retrieved 2021-01-12.
  11. ^ Rackauckas, Chris; Innes, Mike; Ma, Yingbo; Bettencourt, Jesse; White, Lyndon; Dixit, Vaibhav (2019-02-06). "DiffEqFlux.jl - A Julia Library for Neural Differential Equations". arXiv:1902.02376 [cs.LG].
  12. ^ Schlothauer, Sarah (2019-01-25). "Machine learning meets math: Solve differential equations with new Julia library". JAXenter. Retrieved 2019-10-21.
  13. ^ "Flux – Reinforcement Learning vs. Differentiable Programming". fluxml.ai. Archived from the original on 2019-03-27. Retrieved 2019-06-02.
  14. ^ "Flux – What Is Differentiable Programming?". fluxml.ai. Archived from the original on 2019-03-27. Retrieved 2019-06-02.
  15. ^ Heath, Nick (December 6, 2018). "Julia vs Python: Which programming language will rule machine learning in 2019?". TechRepublic. Retrieved 2019-06-03.
  16. ^ Innes, Michael (2018-10-18). "Don't Unroll Adjoint: Differentiating SSA-Form Programs". arXiv:1810.07951 [cs.PL].
  17. ^ a b Heath, Nick (January 25, 2019). "GitHub: The top 10 programming languages for machine learning". TechRepublic. Retrieved 2019-06-03.
  18. ^ Saba, Elliot; Fischer, Keno (2018-10-23). "Automatic Full Compilation of Julia Programs and ML Models to Cloud TPUs". arXiv:1810.09868 [cs.PL].
  19. ^ Dean, Jeff [@JeffDean] (October 24, 2018). "Julia + TPUs = fast and easily expressible ML computations" (Tweet). Retrieved 2019-06-02 – via Twitter.
  20. ^ Patrawala, Fatema (2019-11-28). "Julia Computing research team runs machine learning model on encrypted data without decrypting it". Packt Hub. Retrieved 2019-12-11.
  21. ^ "Machine Learning on Encrypted Data Without Decrypting It". juliacomputing.com. 2019-11-22. Archived from the original on 2019-12-03. Retrieved 2019-12-11.
  22. ^ Yadav, Rohit (2019-12-02). "Julia Computing Uses Homomorphic Encryption For ML. Is It The Way Forward?". Analytics India Magazine. Retrieved 2019-12-11.
  23. ^ Roesch, Jared; Lyubomirsky, Steven; Kirisame, Marisa; Pollock, Josh; Weber, Logan; Jiang, Ziheng; Chen, Tianqi; Moreau, Thierry; Tatlock, Zachary (2019). "Relay: A High-Level IR for Deep Learning". arXiv:1904.08368 [cs.LG].
  24. ^ Besard, Tim; Foket, Christophe; De Sutter, Bjorn (2019). "Effective Extensible Programming: Unleashing Julia on GPUs". IEEE Transactions on Parallel and Distributed Systems. 30 (4). Institute of Electrical and Electronics Engineers (IEEE): 827–841. arXiv:1712.03112. doi:10.1109/tpds.2018.2872064. S2CID 11827394.
  25. ^ Besard, Tim (2018). Abstractions for Programming Graphics Processors in High-Level Programming Languages (PhD). Ghent University.
This page was last edited on 30 April 2024, at 14:38. Text is available under the CC BY-SA 3.0 Unported License; non-text media are available under their specified licenses.