weekend ai reads for 2023-02-24

🏗️ FOUNDATIONS

  • What Is ChatGPT Doing … and Why Does It Work? Stephen Wolfram

  • via Patrick (and the following two links, as well): ChatGPT is built around a large language model (LLM), so it is worth reading about LLM advances generally. Specifically, OpenAI based the technology on GPT (Generative Pre-trained Transformer), its general-purpose underlying language model. OpenAI's "secret sauce," however, is two models layered on top of GPT: (1) a “reward” model trained on human rankings of dialogue responses from best to worst, and (2) a reinforcement learning model that adjusts GPT to mimic human conversation by aligning it to that reward model. (A rough sketch of the reward model's loss follows the next link.)

  • This blog post from OpenAI explains their work nicely. Here’s their technical paper [PDF].
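
    • Aside: the “reward” model described above is trained with a pairwise ranking loss that pushes it to score the human-preferred response above the rejected one. Here is a minimal Python sketch of that loss; the loss form comes from the paper linked above, but the function name and the example scores are illustrative, not OpenAI's code:

      import math

      def pairwise_ranking_loss(score_preferred, score_rejected):
          # -log(sigmoid(r_w - r_l)): small when the reward model scores the
          # human-preferred response above the rejected one, large otherwise.
          return -math.log(1.0 / (1.0 + math.exp(-(score_preferred - score_rejected))))

      print(pairwise_ranking_loss(2.0, -1.0))   # ranking matches the human: ~0.05
      print(pairwise_ranking_loss(-1.0, 2.0))   # ranking reversed: ~3.05

      (The reinforcement learning step then fine-tunes GPT toward responses this learned scorer rates highly.)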

🎓 EDUCATION and AI

📊 DATA & TECHNOLOGY

  • Big Data is Dead Jordan Tigani, formerly of BigQuery at Google

    • Sample of the table of contents: “Workload sizes are smaller than overall data sizes; Most data is rarely queried; The Big Data Frontier keeps receding; Data is a Liability”

  • via Patrick: How Do We Fix and Update Large Language Models? Human-Centered Artificial Intelligence at Stanford University

  • Well, BioGPT is here. Twitter (sorry)

    • Only a matter of time before someone takes my EduGPT idea and makes hundreds of dollars off of it. (Kidding; my EduGPT idea is very different to this, and it would make dozens of dollars, not hundreds)

🎉 FUN and/or PRACTICAL THINGS