LLaMA2

[[wikipedia:LLaMA2|LLaMA 2]] pretrained models were trained on 2 [[trillion]] [[tokens]].
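As a rough sketch of what the 2-trillion-token figure implies, the snippet below estimates training compute for the three LLaMA 2 model sizes (7B, 13B, 70B) using the common 6·N·D FLOPs approximation for dense transformer training; the formula is an assumption here, not something stated on this page.

```python
# Rough training-compute estimate from the 2-trillion-token figure.
# Assumption: the common 6 * N * D FLOPs approximation for dense
# transformer training (N = parameters, D = training tokens).

def train_flops(params: float, tokens: float) -> float:
    """Approximate training FLOPs as 6 * parameters * tokens."""
    return 6 * params * tokens

tokens = 2e12  # 2 trillion tokens, per this page

# LLaMA 2 pretrained model sizes: 7B, 13B, and 70B parameters
for params in (7e9, 13e9, 70e9):
    print(f"{params / 1e9:.0f}B: ~{train_flops(params, tokens):.1e} FLOPs")
```

By this estimate the 7B model alone works out to roughly 8.4e22 FLOPs, which is why total token count is a key headline number for pretrained models.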
 
* https://ai.meta.com/llama/
 

Latest revision as of 19:24, 22 December 2023
