Machine Learning Times
EXCLUSIVE HIGHLIGHTS
2 More Ways To Hybridize Predictive AI And Generative AI
  Originally published in Forbes. Predictive AI and generative AI...
How To Overcome Predictive AI’s Everyday Failure
  Originally published in Forbes. Executives know the importance of predictive...
Our Last Hope Before The AI Bubble Detonates: Taming LLMs
  Originally published in Forbes. To know that we’re in...
The Agentic AI Hype Cycle Is Out Of Control — Yet Widely Normalized
  Originally published in Forbes. I recently wrote about how...

Language Models

The Biggest Bottleneck for Large Language Model Startups is UX

 Originally published in Innovation Endeavors, Nov 1, 2022. Applied large language model startups have exploded in the past year. Enormous advances in underlying language modeling technology, coupled with the early success of products like GitHub Copilot, have led to a huge array of founders using LLMs to rethink workflows ranging from code reviews to copywriting to analyzing unstructured product feedback.

Getting Tabular Data from Unstructured Text with GPT-3: An Ongoing Experiment

 Originally published by Roberto Rocha. One of the most exciting applications of AI in journalism is the creation of structured data from unstructured text. Government reports, legal documents, emails, memos… these are rich with content like names,...
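
 The core workflow the article explores can be sketched in a few lines: prompt the model with an instruction plus the raw document, ask for delimited output, and parse that output into rows. The snippet below is a minimal illustration rather than the article's actual code; the get_table helper, the prompt wording, and the model name are assumptions, and it uses the current openai Python client's chat endpoint in place of the original GPT-3 completion endpoint.

```python
# Minimal sketch: turning unstructured text into tabular rows with an LLM.
# Assumes the openai Python client (v1+) and OPENAI_API_KEY in the environment;
# the model name, prompt, and get_table helper are illustrative placeholders,
# not the article's code.
import csv
import io

from openai import OpenAI

client = OpenAI()

PROMPT = """Extract every person mentioned in the text below as CSV
with the columns: name, title, organization.
Return only the CSV, with a header row and no commentary.

Text:
{document}
"""

def get_table(document: str) -> list[dict]:
    """Ask the model for CSV output and parse it into a list of row dicts."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": PROMPT.format(document=document)}],
        temperature=0,        # low temperature keeps the output easier to parse
    )
    csv_text = response.choices[0].message.content.strip()
    return list(csv.DictReader(io.StringIO(csv_text)))

rows = get_table("Mayor Jane Smith met with CFO Bob Lee of Acme Corp on Tuesday.")
print(rows)
```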

Productizing Large Language Models

 Originally posted on Replit.com, Sept 21, 2022. Large Language Models (LLMs) are known for their near-magical ability to learn from very few examples, as few as zero, to create language wonders. LLMs can chat, write...
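
 As a concrete illustration of that few-shot behavior, the sketch below builds a prompt from a handful of labeled examples and asks the model to continue the pattern; no fine-tuning is involved. It is a generic example under assumed names (the classify_sentiment function, the example list, and the model name are not from the post), again using the current openai Python client.

```python
# Few-shot prompting sketch: the model picks up the task from a handful of
# in-prompt examples rather than from fine-tuning. All names and examples
# here are illustrative, not taken from the Replit post.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

EXAMPLES = [
    ("The checkout flow is broken again.", "negative"),
    ("Love the new dark mode!", "positive"),
    ("Docs could use more screenshots.", "neutral"),
]

def classify_sentiment(text: str) -> str:
    # Build the prompt: a brief instruction, the labeled examples, then the query.
    lines = ["Label each piece of product feedback as positive, negative, or neutral."]
    for example, label in EXAMPLES:
        lines.append(f"Feedback: {example}\nLabel: {label}")
    lines.append(f"Feedback: {text}\nLabel:")
    prompt = "\n\n".join(lines)

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
    )
    return response.choices[0].message.content.strip()

print(classify_sentiment("The editor keeps freezing on large files."))
```

 Dropping the EXAMPLES block entirely turns the same call into a zero-shot prompt, which is the "as few as zero" case the post refers to.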

Using DeepSpeed and Megatron to Train Megatron-Turing NLG 530B, the World’s Largest and Most Powerful Generative Language Model

 Originally published in Microsoft Research Blog, Oct 11, 2021. We are excited to introduce the DeepSpeed- and Megatron-powered Megatron-Turing Natural Language Generation model (MT-NLG), the largest and most powerful monolithic transformer language model trained to date,...