2024: Our Second “Summer of AI” Begins!
I remember when I finally got the “AI bug.”
It was late March, 2023. I travelled to Wichita, Kansas for the funeral of my last remaining uncle on my dad’s side.
Sitting in the St. Louis airport, I fired up my laptop, showed my dad ChatGPT, and we started asking it questions. He asked about “dual use technology” (somehow fitting for AI) and we thought the answer was reasonable. While my 20-something son was exploring downtown Wichita, I started playing around with code generation, using GitHub Copilot to generate Python Boto code.
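The exact prompts and output are long gone, so treat the snippet below as a sketch of the kind of code I was nudging Copilot toward, assuming AWS credentials are already configured locally:

# A minimal boto3 sketch of the sort of thing I was asking Copilot for:
# list the S3 buckets in the account and when they were created.
import boto3

s3 = boto3.client("s3")
response = s3.list_buckets()
for bucket in response["Buckets"]:
    print(bucket["Name"], bucket["CreationDate"])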
It wasn’t great, but I felt I’d held out long enough.
I hadn’t given ChatGPT a look-see even when I was on Twitter. While we waited to return to BWI, I started subscribing to a whole lot of podcasts, began recording links at https://github.com/mdfranz/cheetsheetz/tree/main/ai, and started building my AI Learning repo, which is still on my GitLab site.
At work, I started a Friday knowledge-sharing session within the Security team and posed the following questions:
I made progress on only one of these, and it became the topic of a presentation I gave in late April.
https://www.youtube.com/watch?v=5m3VaBP5UQc
Having fallen down the Elasticsearch-on-Kubernetes rabbit hole since then, I really haven’t had the time to focus on AI much. But this weekend, when I picked up my daughter from a friend’s apartment in College Park, I listened to a few podcasts that deserve honorable mention. They are a good way to mark the time, what I’ve learned, and, more importantly, what others are talking about.
Forget about AI PCs, but do consider local LLMs!
The Practical AI Podcast is one of my favorites: fairly light, but not too TechBro-ish, unlike some other podcasts that I’ve stopped listening to.
I was also pleased that there was absolutely no mention of Recall, which has popped up on Infosec.Exchange enough that I’m sick of it. If you are already an Ollama wizard this episode is not for you, but for those who aren’t familiar with it, or with the model file formats on HuggingFace, I’d encourage you to check it out! It was a good refresher, and reassurance that I was on the right track.
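For anyone who hasn’t touched Ollama yet, here is a rough sketch of what “local” looks like in practice: once Ollama is installed and a model has been pulled, you can query its local HTTP API from Python. The model name and prompt below are just placeholders for illustration.

# Query a local Ollama server; assumes a model (e.g. "llama3") has
# already been pulled with `ollama pull llama3`.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",  # whatever model you have pulled locally
        "prompt": "Explain dual-use technology in two sentences.",
        "stream": False,    # return a single JSON response
    },
    timeout=120,
)
print(resp.json()["response"])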
One of the themes was where computing power lives. Think Mainframe vs. Dumb Terminal. Think Client vs. Server. Cloud vs. On-Prem, and whatever the hell edge computing is. The same applies to training and inference. What can be local? What must run in the cloud? What can run on a mobile device?
AI and Open Source: the Long View
The next podcast brings the same long-view perspective to Open Source. Like me, the podcasters can remember the days when Open Source software was not taken for granted. I remember when the Open Source approval process was launched at Cisco 20 years ago. I remember the late 1990s, when Linux was an oddity.
This is a heavier podcast, getting into governance and security (and the business of Open Source), but it is also worth it, and also a validation that Open Source LLMs are truly revolutionary and radical, despite whatever Microsoft and OpenAI are doing!
It is still late Spring in Maryland though!
As my wife laments, Summer really doesn’t come to Maryland until July. Sure, there are a few times in May when things hit the 90s, but it really stays mild for another few weeks.