OurBigBook Wikipedia Bot Documentation
Foundation models are large-scale machine learning models trained on broad, diverse data sources to perform a wide range of tasks, often with little to no task-specific fine-tuning. Models such as GPT (Generative Pre-trained Transformer) and BERT (Bidirectional Encoder Representations from Transformers) serve as a foundational platform upon which more specialized models can be built.
