From Wikipedia, the free encyclopedia

Nir Shavit (Hebrew: ניר שביט) is an Israeli computer scientist. He is a professor in the Computer Science Department at Tel Aviv University and a professor of electrical engineering and computer science at the Massachusetts Institute of Technology.

Nir Shavit received B.Sc. and M.Sc. degrees in computer science from the Technion - Israel Institute of Technology in 1984 and 1986, and a Ph.D. in computer science from the Hebrew University of Jerusalem in 1990. Shavit is a co-author of the book The Art of Multiprocessor Programming, is a winner of the 2004 Gödel Prize in theoretical computer science for his work on applying tools from algebraic topology to model shared memory computability, and a winner of the 2012 Dijkstra Prize for the introduction and first implementation of software transactional memory. He is a past program chair of the ACM Symposium on Principles of Distributed Computing (PODC) and the ACM Symposium on Parallelism in Algorithms and Architectures (SPAA).

He heads up the Computational Connectomics Group at MIT's Computer Science and Artificial Intelligence Laboratory, focusing on techniques for designing, implementing, and reasoning about multiprocessors, and in particular the design of concurrent data structures for multi-core machines.
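Concurrent data structures of the kind studied in this area avoid locks by retrying an atomic compare-and-swap (CAS) until it succeeds. The following is a minimal illustrative sketch of a lock-free (Treiber-style) stack, not code from Shavit's work; the `Ref.compare_and_swap` helper is a toy stand-in, implemented here with a lock, for the hardware CAS instruction a real implementation would use.

```python
import threading

class Ref:
    """Toy atomic reference; real code would use a hardware CAS instruction."""
    def __init__(self, value=None):
        self._value = value
        self._lock = threading.Lock()

    def get(self):
        return self._value

    def compare_and_swap(self, expected, new):
        # Atomically replace the value only if it is still `expected`.
        with self._lock:
            if self._value is expected:
                self._value = new
                return True
            return False

class Node:
    def __init__(self, item, next_node):
        self.item = item
        self.next = next_node

class TreiberStack:
    """Lock-free stack: read the top, then CAS it; retry if another thread won."""
    def __init__(self):
        self.top = Ref(None)

    def push(self, item):
        while True:
            old = self.top.get()
            if self.top.compare_and_swap(old, Node(item, old)):
                return

    def pop(self):
        while True:
            old = self.top.get()
            if old is None:
                return None  # stack is empty
            if self.top.compare_and_swap(old, old.next):
                return old.item

s = TreiberStack()
for i in range(3):
    s.push(i)
print([s.pop(), s.pop(), s.pop()])  # → [2, 1, 0]
```

The retry loop is the essential pattern: no thread ever blocks holding a lock, so a stalled thread cannot prevent others from making progress.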


He co-founded the company Neural Magic together with Alexander Matveev. The company claims that highly sparse neural networks can make deep learning so computationally efficient that GPUs are not needed; for certain use cases it claims a speed-up of 175x.[2]
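The intuition behind the efficiency claim is that in a sparse network most weights are zero, so they can be skipped entirely. The sketch below is a toy illustration of that idea (not Neural Magic's actual implementation): a sparse matrix-vector product stores only the nonzero weights and therefore performs far fewer multiplications than the dense version.

```python
def dense_matvec(W, x):
    # Dense layer: every weight participates in the product, zeros included.
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def to_sparse(W):
    # Keep only (value, column) pairs for nonzero weights, per row.
    return [[(w, j) for j, w in enumerate(row) if w != 0.0] for row in W]

def sparse_matvec(rows, x):
    # Multiply only the stored nonzeros.
    return [sum(w * x[j] for w, j in row) for row in rows]

W = [
    [0.0, 2.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 1.5],
    [3.0, 0.0, 0.0, 0.0],
]
x = [1.0, 2.0, 3.0, 4.0]

print(dense_matvec(W, x))            # → [4.0, 6.0, 3.0]
print(sparse_matvec(to_sparse(W), x))  # same result
nnz = sum(len(r) for r in to_sparse(W))
print(nnz, "multiplies instead of", len(W) * len(W[0]))  # 3 instead of 12
```

At the extreme sparsity levels claimed for such networks, this kind of skipping is what allows commodity CPUs to compete with GPUs on inference workloads.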

References

  1. ^ "ACM Names Fellows for Computing Advances that Are Transforming Science and Society", Association for Computing Machinery, archived 2014-07-22 at the Wayback Machine, accessed 2013-12-10.
  2. ^ "The Future of Deep Learning is Sparse", Neural Magic, 12 July 2019.

This page was last edited on 6 April 2024, at 22:14
This page is based on a Wikipedia article. Text is available under the CC BY-SA 3.0 Unported License; non-text media are available under their specified licenses. Wikipedia® is a registered trademark of the Wikimedia Foundation, Inc. WIKI 2 is an independent company with no affiliation with the Wikimedia Foundation.