
From Wikipedia, the free encyclopedia

Granite
Developer(s): IBM Research[1]
Initial release: November 7, 2023 (2023-11-07)
Platforms: IBM Watsonx (initially); GitHub; Hugging Face; RHEL AI
Type: Large language model
License: Proprietary; code models: open source (Apache 2.0)[2]

IBM Granite is a series of decoder-only foundation models created by IBM. It was announced on September 7, 2023,[3][4] and an initial paper was published four days later.[5] Initially intended for use in Watsonx, IBM's cloud-based data and generative AI platform, alongside other models,[6] IBM later open-sourced some of its code models.[7] Granite models are trained on datasets curated from the Internet, academic publications, code datasets, and legal and financial documents.[8][9][1]

Foundation models

A foundation model is an AI model trained on broad data at scale such that it can be adapted to a wide range of downstream tasks.[10]

Granite's first foundation models were Granite.13b.instruct and Granite.13b.chat. The "13b" in their names refers to their 13 billion parameters, fewer than most of the larger models of the time. Later models range from 3 to 34 billion parameters.[3][11]

On May 6, 2024, IBM released the source code of four variations of the Granite Code Models under Apache 2.0, a permissive open-source license that allows free use, modification, and redistribution of the software, and published them on Hugging Face for public use.[12][13] According to IBM's own report, Granite 8b outperforms Llama 3 on several coding-related tasks within a similar range of parameters.[14][15]
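Because the open-sourced checkpoints are hosted on Hugging Face, they can be loaded like any other model supported by the transformers library. The sketch below is a minimal, hypothetical example: the exact checkpoint name and generation settings are assumptions for illustration, not taken from IBM's documentation.

```python
# Minimal sketch: generating code with an open-source Granite Code model
# through the Hugging Face transformers library. The checkpoint name and
# generation settings below are assumptions for illustration.

MODEL_ID = "ibm-granite/granite-8b-code-instruct"  # assumed checkpoint name


def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Tokenize a prompt, run generation, and decode the result."""
    # Imported lazily so the module can be inspected without the (large)
    # transformers dependency installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Write a Python function that reverses a string."))
```

Downloading and running an 8-billion-parameter model requires substantial disk space and memory; smaller Granite Code variants follow the same loading pattern.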

References

  1. ^ a b McDowell, Steve. "IBM's New Granite Foundation Models Enable Safe Enterprise AI". Forbes.
  2. ^ "ibm-granite/granite-code-models". GitHub. IBM Granite. 2024-05-08. Retrieved 2024-05-08.
  3. ^ a b Nirmal, Dinesh (September 7, 2023). "Building AI for business: IBM's Granite foundation models". IBM.
  4. ^ "IBM debuts Granite series of hardware-efficient language models". September 7, 2023.
  5. ^ "Granite Foundation Models" (PDF). IBM. 2023-11-30.
  6. ^ Fritts, Harold (2024-04-22). "IBM Adds Meta Llama 3 To watsonx, Expands AI Offerings". StorageReview.com. Retrieved 2024-05-08.
  7. ^ Jindal, Siddharth (2024-05-07). "IBM Releases Open-Source Granite Code Models, Outperforms Llama 3". Analytics India Magazine. Retrieved 2024-05-08.
  8. ^ Azhar, Ali (2024-04-08). "IBM Patents a Faster Method to Train LLMs for Enterprises". Datanami. Retrieved 2024-05-08.
  9. ^ Wiggers, Kyle (2023-09-07). "IBM rolls out new generative AI features and models". TechCrunch. Retrieved 2024-05-08.
  10. ^ "Introducing the Center for Research on Foundation Models (CRFM)". Stanford HAI. 18 August 2021.
  11. ^ Pawar, Sahil (2023-09-11). "IBM Introduces Granite Series LLM Models for Watsonx Platform". Analytics Drift. Retrieved 2024-05-09.
  12. ^ Nine, Adrianna (May 7, 2024). "IBM Makes Granite AI Models Open-Source Under New InstructLab Platform". ExtremeTech.
  13. ^ "IBM open-sources its Granite AI models - and they mean business". ZDNET. Retrieved 2024-05-21.
  14. ^ Jindal, Siddharth (2024-05-07). "IBM Releases Open-Source Granite Code Models, Outperforms Llama 3". Analytics India Magazine. Retrieved 2024-05-09.
  15. ^ Synced (2024-05-13). "IBM's Granite Code: Powering Enterprise Software Development with AI Precision | Synced". syncedreview.com. Retrieved 2024-05-21.

This page was last edited on 21 May 2024, at 11:22
This article is based on Wikipedia. Text is available under the CC BY-SA 3.0 Unported License. Non-text media are available under their specified licenses. Wikipedia® is a registered trademark of the Wikimedia Foundation, Inc. WIKI 2 is an independent company with no affiliation with the Wikimedia Foundation.