Welcome to the 63rd issue of Python Monthly!
If it’s your first time here, welcome, I like you already. If you want the full back story on this monthly newsletter, head here.
The quick version: I curate and share the most important Python articles, news, resources, podcasts, and videos.
Think the Pareto Principle (80/20 rule) meeting the Python world. I give you the 20% that will get you 80% of the results.
If you're a long-time reader, welcome back, old friend.
Alright, let's not waste any valuable time and jump right into this month's updates.
Ever had a Python function behave strangely, remembering values between calls when it shouldn't? You're not alone! This is one of Python's sneakiest pitfalls: mutable default parameters. Learn about them here.
A perfect way to spend your weekend: compile the original Python 1.0! Python is now 31 years old, so it's time to look back. Follow this guide and do the impossible: see what Python really looked like when the first stable release came out.
NVIDIA's CUDA (Compute Unified Device Architecture) is a platform and C/C++ extension for writing programs that run directly on the GPU. Its programming model and APIs let developers offload parallelizable work from the CPU to the GPU, unlocking significant performance gains. In essence, you break your problem into many small pieces that can be solved at the same time (like giving each GPU core a tiny task).
Learn how it all works using Python in this great introductory guide.
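To get a feel for that "many small pieces" idea before diving into the guide, here's a rough pure-Python analogy using the standard library. To be clear, this is not CUDA code (real CUDA kernels run on thousands of GPU cores, and CPython threads won't actually speed up CPU-bound work); it only illustrates the decomposition pattern of splitting a big job into chunks that workers process independently:

```python
from concurrent.futures import ThreadPoolExecutor

# Problem: square one million numbers.
data = list(range(1_000_000))

def square_chunk(chunk):
    # Each worker handles one small, independent piece of the problem,
    # loosely like a GPU block processing one tile of the data.
    return [x * x for x in chunk]

# Split the input into chunks so they can be processed in parallel.
chunk_size = 250_000
chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

with ThreadPoolExecutor() as pool:
    partial_results = pool.map(square_chunk, chunks)

# Stitch the partial results back together.
squared = [y for chunk in partial_results for y in chunk]
print(squared[:5])  # [0, 1, 4, 9, 16]
```

The CUDA version of this idea replaces the thread pool with a GPU kernel launched over a grid of threads, where each thread computes one (or a few) output elements.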
PyPI now supports iOS and Android wheels, making it easier for Python developers to distribute mobile packages.
Software development topics I've changed my mind on after 10 years in the industry... well not me exactly, but the author of this post. I tend to agree with most of these. It's important to be flexible and willing to change your opinions throughout your career.
I have been building so many small products using LLMs. It has been fun, and useful. However, there are pitfalls that can waste so much time. A while back a friend asked me how I was using LLMs to write software. I thought “oh boy. how much time do you have!” and thus this post.
This is a great read and good insights even if you don't use any AI tools.
Want to learn SQL in a fun way? More fun than the ZTM SQL Bootcamp? No such thing... but here is the second best option: SQL Noire. Solve a murder mystery with SQL and have fun.
This month, Andrej Karpathy (famous AI researcher and educator) released a video titled "Deep dive into LLMs like ChatGPT." It’s a goldmine of information, but it’s also 3 hours and 31 minutes long. This author watched the whole thing and made a TL;DR version for anyone who wants the essential takeaways without the large time commitment.
A nice short read on some good developer principles and philosophy from a senior. My favourite one is "There is usually a simpler way to write it".
El Salvador abandoned Bitcoin as legal tender. The main reason: the IMF (think a bank that countries get loans from) made abandoning it a condition for a $1.4 billion loan, as they considered the strategy too risky.
Argentina's president made a classic oopsie with crypto: Javier Milei has backtracked on a tweet promoting a memecoin called Libra, which rose to a $4.4 billion market cap before plunging by more than 95%.
In case you have Impostor Syndrome: a young computer scientist and two colleagues have shown that searches within data structures called hash tables can be much faster than was deemed possible for the past 40 years.
DeepSeek is on a mission to keep open sourcing their work: they released 5 big open source projects this month.
Google announced community release of Gemini 2.0: their best model yet for coding performance and complex prompts.
Anthropic had a very big month. First they released a report on how their AI tools are being used: turns out it's mostly the software industry. Then they released probably the best coding AI out there: Claude 3.7 Sonnet and Claude Code. If Claude Code really gets going, it can change how you code dramatically... it looks very impressive.
X announced Grok 3, which promises to be the unfiltered version of AI.
Microsoft had a big Quantum Computing breakthrough this month: Majorana 1, a quantum chip using topological superconductors and theoretical particles made real. The eight-qubit chip aims to scale to one million qubits, potentially creating computers more powerful than all current ones combined.
In case you love the show Severance: go refine some microdata.
Be an artist.
This is the most appropriate link for this section title. Interesting facts about obscure islands.
Everything that happens when you enter "google.com" into your browser.
This is an 18 lesson slide presentation by 2 professors. It asks an important question about LLMs and AI: Are they modern day oracles or are they bullshit machines? I promise that if you take the time and go through the 18 lessons, you will come out the other end better informed about AI tools and how they work. This was my favourite resource of the month!
Don't let the title fool you though. The lessons do a good job balancing out skepticism with practicality of using these tools.
This lesson is especially important:
AI chatbots are designed to be anthropoglossic: able to speak, write, and converse in human-like fashion. When we interact with anthropoglossic systems, we naturally assume they have the full range of human capabilities. They don't.
When it comes to bullshitting anyone about anything, an LLM has a huge advantage over any human.
People use language in ways that signal belonging. In a social situation, one might use a particular type of slang; in an academic paper, one might rely on certain forms of jargon. But each of us has limited experience and expertise. We only belong to a few social groups; we are only expert in a few domains. We don't have the insider knowledge to speak in the codes of groups we don't belong to.
ChatGPT has access to many of the codes of many different groups. As a result, it is better than a human outsider at mimicking the modes and patterns of speech, the dialects and slang and jargon of the groups that are well-represented in its training set.
Therein lies its superhuman bullshitting ability. I have one perspective; it has been trained on millions of perspectives. I can’t go bullshit a bunch of radiologists at a radiologist convention, but ChatGPT possibly could. At least it could convince an adjacent group—surgeons, say—that it was an expert radiologist.
A new Python IDE is in town... it's a paid IDE though... bummer.
Goose - Open Source, and locally run: your on-machine AI agent, automating engineering tasks seamlessly.
Cool concept if you are looking for a fun project. You can find the GitHub link in there so you can create one for your own life: My Life in Weeks
Want to be a freelancer? Here is how to do it.
How to change your settings to make yourself less valuable to Meta. Takes 2 minutes.
See you next month everyone... also share this with your friends... pretty please! ❤️
By the way, I teach people how to code and get hired in the most efficient way possible as an Instructor at the Zero To Mastery Academy. You can see a few of our courses below or see all ZTM courses here.