• Rich Tong Family and Friends

  • By: Rich Tong
  • Podcast

  • Summary

  • Rich Tong of tongfamily.com fame and a rotating set of co-hosts (Steven, Paul, Mike, and Deon) cover all the current news: all things Apple, Smart Home, and Smart EDC with Paul, and all things Startups, Robotics, Smart Health, and Artificial Intelligence with Deon. Or whatever they want to talk about. Tips, tricks, and traps on technology, mobile, software, machine learning, and shopping since 1996 at https://tongfamily.com
Episodes
  • ST1. August Update on AI 2024-08-28
    Oct 8 2024

    There's been a hiatus mainly because we've been shipping products, but more because I lost all my skills at making new videos. I was stuck for a long time on Drop Zones and doing a better intro and outro, but that's a digression for Final Cut Pro nerds.

    In this episode, we cover the latest AI trends as of August 28, 2024. The big news has been the shipment of so many Large Language Models (LLMs), which for AI means way more choice and confusion.

    Plus the emergence of a much better set of much smaller tools, sometimes called Small Language Models (SLMs), and of Agents that chop the problem into many small pieces.

    I also wanted to introduce Steven to the mix. We are going to have a rotating set of co-hosts and solo episodes as well, so we can get the content out on time and not take six weeks for post-production. Thanks to the new intro and outro, that should be easier. I'm playing with these a bunch, but check out https://tongfamily.com and https://tne.ai where I hang out a bunch!

    53 mins
  • RT3. AI Hardware Introduction
    Mar 9 2024

    This is another sort of nerdy side note. If anyone is still watching, this section just gives some intuition on the basics of the hardware. There are lots of assumptions about GPUs and CPUs that I wanted to make sure people understood. The basics are that CPUs are tuned for lots of branches and different workflows, while GPUs are tuned for lots of the same thing, like matrix math. And because they are so fast, most of the job of the computer folks is "feeding the beast", that is, caching the most frequently used information so the processors don't have to wait. There are, I think, some errors in the CPU and GPU performance levels, particularly in the cache performance, as it is not very clear how this works, and the results of course vary depending on the processor models, so these are all approximations. Put better sources in the comments. I have all the sources listed in a spreadsheet that is part of this, and we are happy to send the source data to anyone who wants it. I'll fix these errors in later editions (as I'm obsessive that way).
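    If you want to feel the CPU-versus-GPU intuition on your own machine, here is a rough sketch (mine, not from the episode) in Python with NumPy. The vectorized matrix multiply stands in for the wide, uniform hardware path that GPUs are tuned for; the triple Python loop stands in for doing each element as its own branchy step.

    ```python
    import time
    import numpy as np

    # Rough intuition only: a GPU-style workload is "the same operation on
    # lots of data" (here, one bulk matrix multiply), while a CPU-style
    # workload is branchy, one small step at a time.

    n = 100
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)

    t0 = time.perf_counter()
    c_fast = a @ b                      # one uniform bulk operation
    t_fast = time.perf_counter() - t0

    t0 = time.perf_counter()
    c_slow = np.zeros((n, n))
    for i in range(n):                  # element by element, lots of control flow
        for j in range(n):
            s = 0.0
            for k in range(n):
                s += a[i, k] * b[k, j]
            c_slow[i, j] = s
    t_slow = time.perf_counter() - t0

    print(f"bulk matmul: {t_fast:.5f}s, loopy matmul: {t_slow:.5f}s")
    ```

    Both compute the same matrix, but the bulk version is orders of magnitude faster, which is the whole point of building hardware for uniform work and then "feeding the beast".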

    Also, I'm quite proud of the HDR mix: using the latest OBS settings, producing in HDR in Final Cut Pro, and adjusting the video scope levels helps. The audio is a little hot and I'm sorry about that; I'll turn it down next time, as I spent too much time in the red. My Scarlett needs to be at about 1 o'clock and it works.

    See https://youtu.be/FupclouzYTI for a video version. And more details at https://tongfamily.com/2024/03/08/pod-rt3-ai-hardware-introduction/


    Chapters:

    • 00:00 AI Hardware Introduction
    • 00:42 Computer Engineering in Two Slides
    • 05:40 It's 165 Years to a Single Disk Access?!!
    • 14:12 Intel Architecture CPU
    • 17:03 What's all this about NVidia
    • 25:24 And now for something completely different, Apple
    • 29:45 Introduction Summary

    2024-03-08. Shot as UHD HEVC HDR PQ 10-bit using OBS Studio and Final Cut Pro. © 2024 Iron Snow Technologies, LLC. All Rights Reserved.

    30 mins
  • DT6. AI Intro and Intuitions
    Mar 8 2024

    OK, we are not experts nor PhDs, so most of this is probably not technically correct, but the math and the concepts are so complicated that we thought it would be good to just get some intuition on what is happening. So this is a quick talk that summarizes readings from many different sources about the history of AI, from the 1950s all the way to January 2024 or so. It is really hard to keep up, but much harder without some intuition about what is happening. We cover expert systems, the emergence of neural networks, Convolutional Neural Networks, and Recurrent Neural Networks. The "Attention Is All You Need" paper led to Transformers, and then finally some intuition on how such a simple idea, that is, training on things, can lead to emergent and unexpected behaviors, and finally some intuition on how generative images work.
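    For the curious, the mechanism at the heart of "Attention Is All You Need" fits in a few lines. This is my own minimal NumPy sketch of single-head scaled dot-product attention, not anything from the episode: each token's query is compared against every token's key, and the resulting weights say how much of each token's value to blend in.

    ```python
    import numpy as np

    # Scaled dot-product attention:
    #   Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V

    def softmax(x, axis=-1):
        e = np.exp(x - x.max(axis=axis, keepdims=True))  # numerically stable
        return e / e.sum(axis=axis, keepdims=True)

    def attention(q, k, v):
        d_k = q.shape[-1]
        scores = q @ k.T / np.sqrt(d_k)   # (tokens, tokens) similarity matrix
        weights = softmax(scores)          # each row sums to 1
        return weights @ v, weights

    rng = np.random.default_rng(0)
    tokens, d = 4, 8                       # toy sizes for illustration
    q = rng.normal(size=(tokens, d))
    k = rng.normal(size=(tokens, d))
    v = rng.normal(size=(tokens, d))

    out, w = attention(q, k, v)
    print(out.shape)       # one blended vector per token
    print(w.sum(axis=1))   # attention weights for each token sum to 1
    ```

    A real Transformer stacks many of these heads with learned projections, but the surprising "emergence" the episode talks about comes from repeating exactly this simple blending step at scale.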

    You can see the slides we are using on YouTube, and find more information at Tongfamily.com.

    Chapters:

    • 23:25 Attention is all you need Transformers
    • 27:36 But it's too Simple! Emergence is surprising
    • 33:04 What emerges inside a Transformer?
    • 43:01 One Model to Rule Them All
    • 47:54 Works for Image generation too


    53 mins
