Machine Learning Times

language models

The Biggest Bottleneck for Large Language Model Startups is UX

 Originally published in Innovation Endeavors, Nov 1, 2022.  Applied large language model startups have exploded in the past year. Enormous advances in underlying language modeling technology, coupled with the early success of products like GitHub Copilot, have led to a huge array of founders using LLMs to rethink workflows ranging from code reviews to copywriting to analyzing unstructured product feedback.

Getting Tabular Data from Unstructured Text with GPT-3: An Ongoing Experiment

 Originally published by Roberto Rocha. One of the most exciting applications of AI in journalism is the creation of structured data from unstructured text. Government reports, legal documents, emails, memos… these are rich with content like names,...
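
As a rough illustration of the prompt-based extraction the piece describes, the sketch below asks a model to turn free text into a CSV table. It assumes the OpenAI Python client with an API key in the environment; the model name, prompt wording, and sample text are placeholders, not details from the article, whose experiments used GPT-3-era models.

# Hedged sketch: prompt-driven extraction of tabular data from free text.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

report = (
    "Inspector Jane Smith visited Acme Foods on 2021-03-14 and issued two violations. "
    "Inspector Raj Patel visited Blue Cafe on 2021-03-20 and issued no violations."
)

prompt = (
    "Extract a CSV table with the columns inspector,business,date,violations "
    "from the text below. Output only the CSV.\n\n" + report
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
    temperature=0,  # deterministic output suits extraction tasks
)

print(response.choices[0].message.content)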

Productizing Large Language Models

 Originally posted on Replit.com, Sept 21, 2022.  Large Language Models (LLMs) are known for their near-magical ability to learn from very few examples (as few as zero) to create language wonders. LLMs can chat, write...
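
The "very few examples" the excerpt refers to is in-context (zero- and few-shot) prompting. The snippet below is a minimal, self-contained illustration of how the two prompt styles differ; the task and example reviews are invented for illustration and do not come from the Replit post.

# Zero-shot vs. few-shot prompting, sketched with made-up examples.

zero_shot = (
    "Classify the sentiment of the review as positive or negative.\n"
    "Review: The soup was cold and the service was slow.\n"
    "Sentiment:"
)

few_shot = (
    "Classify the sentiment of the review as positive or negative.\n"
    "Review: Loved the pasta, will come back.\nSentiment: positive\n"
    "Review: Waited an hour and the food was bland.\nSentiment: negative\n"
    "Review: The soup was cold and the service was slow.\nSentiment:"
)

# Either string can be sent to any completion-style LLM endpoint; the few-shot
# version conditions the model with in-context examples instead of fine-tuning.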

Using DeepSpeed and Megatron to Train Megatron-Turing NLG 530B, the World’s Largest and Most Powerful Generative Language Model

 Originally published in Microsoft Research Blog, Oct 11, 2021. We are excited to introduce the DeepSpeed- and Megatron-powered Megatron-Turing Natural Language Generation model (MT-NLG), the largest and most powerful monolithic transformer language model trained to date,...
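
For readers unfamiliar with the tooling named in the announcement, the fragment below is a generic, minimal DeepSpeed training setup. It is not the MT-NLG configuration, which paired DeepSpeed with Megatron-LM's tensor and pipeline parallelism across thousands of GPUs; the model, config values, and batch sizes here are placeholders.

import deepspeed
import torch

# Stand-in model; a real run would use a Megatron-style transformer.
model = torch.nn.Linear(1024, 1024)

ds_config = {
    "train_micro_batch_size_per_gpu": 4,
    "gradient_accumulation_steps": 8,
    "fp16": {"enabled": True},
    "zero_optimization": {"stage": 1},
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
}

# deepspeed.initialize wraps the model in an engine that handles distributed
# data parallelism, mixed precision, and ZeRO optimizer-state partitioning.
# Launch with the DeepSpeed runner, e.g.:  deepspeed train.py
engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)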