Google PaLM 2 Language Model
A Google PaLM 2 Language Model is a Google foundation LLM.
- Context:
- It can make use of an Embedding for Text System, such as textembedding-gecko@001 (see the embedding sketch after this list).
- It can be a successor to a PaLM 1 Model.
- It is a predecessor to Google Gemini.
- It can be trained on many publicly available source code datasets, including Python Code Datasets and JavaScript Code Datasets.
- It can be heavily trained on multilingual text, spanning more than 100 languages.
- It can be faster and more efficient than previous models.
- It can be accessed via a Google LLM API or a GCP LLM Web Console [1].
- …
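The embedding item above can be illustrated with a short sketch. This is a minimal example, assuming the Vertex AI Python SDK (google-cloud-aiplatform) and a hypothetical project id and region; it calls the textembedding-gecko@001 model named in the Context list.

```python
# Minimal sketch: getting text embeddings from textembedding-gecko@001
# via the Vertex AI Python SDK. Project id and region are placeholders.
import vertexai
from vertexai.language_models import TextEmbeddingModel

vertexai.init(project="my-gcp-project", location="us-central1")  # hypothetical project

model = TextEmbeddingModel.from_pretrained("textembedding-gecko@001")
embeddings = model.get_embeddings(["PaLM 2 is a Google foundation LLM."])

for embedding in embeddings:
    vector = embedding.values  # list of floats representing the input text
    print(len(vector), vector[:5])
```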
- Example(s):
- PaLM 2, Gecko.
- PaLM 2, Otter.
- PaLM 2, Bison (text-bison@001 for text [1], chat-bison@001 for chat; see the usage sketch after this list).
- PaLM 2, Unicorn.
- …
- PaLM 2 (Codey).
- PaLM 2 (Vision).
- …
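The Bison example above names two Vertex AI model identifiers: text-bison@001 for single-turn text generation and chat-bison@001 for multi-turn chat. The sketch below shows one way these models can be called, again assuming the Vertex AI Python SDK and a placeholder project id.

```python
# Minimal sketch: calling the Bison-sized PaLM 2 models on Vertex AI.
# Project id and region are placeholders, not values from this page.
import vertexai
from vertexai.language_models import ChatModel, TextGenerationModel

vertexai.init(project="my-gcp-project", location="us-central1")  # hypothetical project

# Single-turn text generation with text-bison@001.
text_model = TextGenerationModel.from_pretrained("text-bison@001")
response = text_model.predict(
    "Summarize PaLM 2 in one sentence.",
    temperature=0.2,
    max_output_tokens=128,
)
print(response.text)

# Multi-turn chat with chat-bison@001.
chat_model = ChatModel.from_pretrained("chat-bison@001")
chat = chat_model.start_chat(context="You answer questions about Google language models.")
print(chat.send_message("How many languages was PaLM 2 trained on?").text)
```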
- Counter-Example(s):
- another Google LLM, such as: a PaLM Model.
- an OpenAI LLM (GPT-4 LLM).
- an Anthropic LLM (Claude 2 LLM).
- See: Vertex AI Model Garden, Med-PaLM 2 LM.
References
2023
- (Google Blog, 2023) ⇒ https://blog.google/technology/ai/google-palm-2-ai-large-language-model/
- QUOTE: Building on this work, today we’re introducing PaLM 2, our next generation language model. PaLM 2 is a state-of-the-art language model with improved multilingual, reasoning and coding capabilities.
- Multilinguality: PaLM 2 is more heavily trained on multilingual text, spanning more than 100 languages. This has significantly improved its ability to understand, generate and translate nuanced text — including idioms, poems and riddles — across a wide variety of languages, a hard problem to solve. PaLM 2 also passes advanced language proficiency exams at the “mastery” level.
- Reasoning: PaLM 2’s wide-ranging dataset includes scientific papers and web pages that contain mathematical expressions. As a result, it demonstrates improved capabilities in logic, common sense reasoning, and mathematics.
- Coding: PaLM 2 was pre-trained on a large quantity of publicly available source code datasets. This means that it excels at popular programming languages like Python and JavaScript, but can also generate specialized code in languages like Prolog, Fortran and Verilog.
- A versatile family of models
- Even as PaLM 2 is more capable, it’s also faster and more efficient than previous models — and it comes in a variety of sizes, which makes it easy to deploy for a wide range of use cases. We’ll be making PaLM 2 available in four sizes from smallest to largest: Gecko, Otter, Bison and Unicorn. Gecko is so lightweight that it can work on mobile devices and is fast enough for great interactive applications on-device, even when offline. This versatility means PaLM 2 can be fine-tuned to support entire classes of products in more ways, to help more people.