#26 — Context-engineering is the new essential skill
Don't expect AI to read your mind
Hello, friends! 👋
I’m taking a break after this intense 6-month period. Computing Education Things has grown from 0 subscribers to nearly 100, and I’ve been able to talk with several CS educators I admire. I hope you enjoyed the 26 issues I brought you each Friday!
Thanks for reading (and listening) along in 2025! Please hit reply and let me know what you like, what you dislike, and what I could do better next year.
I’ll be back on January 9th. Until then, I’m sure you’ll find something interesting in the archive.
Merry Christmas and Happy New Year to you all!
The shift to AI-Native Engineering
Addy Osmani has become my go-to person when I want to dive into the topic of AI-assisted engineering. I watched his talk at JSNation, and he has very interesting things to say if you’re interested in AI-native engineering.
Addy starts by saying that while we’re clearly experiencing a transformation in software engineering, it’s by no means a replacement. Rather, a new paradigm is emerging that he calls AI-native engineering, where purely manual effort fades and agentic workflows take over, with software engineers becoming orchestrators. Examples: Claude Code for web, Cursor Background Agent, Jules by Google, GitHub Copilot agent, Conductor for Mac.
This “era of orchestration” needs the software engineer as a conductor guiding a single agent, or as an orchestrator directing fleets of agents working in harmony. Orchestration is still at a very experimental stage: startups seem to be experimenting more than the corporate world, which appears more skeptical at the moment.
In this new workflow, AI should help not so much with writing more code faster as with building better software, whether in the design phase (rapid prototyping) or in the inner loop (coding, testing, debugging).
While vibe coding prioritizes speed and experimentation over rigor, AI-assisted engineering is emerging as the methodology where the human stays in the loop and where providing the right context is essential, because on its own the AI doesn’t pick up on key details and is too chaotic (it lacks structure). So the software engineer equips the AI with the necessary context: codebase-level knowledge, constraints, problem definitions... Within this new paradigm, spec-driven development emerges: you plan before the prompt and give the model clear context rather than expecting AI to read your mind, and you feed it by building an “external brain” that improves over time because agents store key insights after each task (this idea of learnings.md).
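The talk doesn’t prescribe a concrete format for that external brain, so here’s a minimal sketch of what a spec-plus-learnings loop might look like. The learnings.md name comes from the idea above; spec.md and the two helper functions are purely illustrative assumptions on my part, not anything Addy specifies.

```python
from pathlib import Path

SPEC = Path("spec.md")            # hypothetical: the plan you write before any prompt
LEARNINGS = Path("learnings.md")  # the "external brain" that grows over time

def build_context(task: str) -> str:
    """Assemble the context handed to the model: spec, then accumulated learnings, then the task."""
    spec = SPEC.read_text() if SPEC.exists() else ""
    learnings = LEARNINGS.read_text() if LEARNINGS.exists() else ""
    return (
        "## Project spec (constraints, problem definition)\n" + spec + "\n\n"
        "## Learnings from previous tasks\n" + learnings + "\n\n"
        "## Current task\n" + task
    )

def record_learning(insight: str) -> None:
    """After a task, store a key insight so the next run starts smarter."""
    with LEARNINGS.open("a", encoding="utf-8") as f:
        f.write(f"- {insight}\n")

# Example usage (agent and file contents are hypothetical):
# prompt = build_context("Add input validation to the signup form.")
# ...run your agent of choice with `prompt`...
# record_learning("Signup validation lives in api/validate.py; reuse it instead of duplicating rules.")
```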
Very interesting: the golden rules for deploying AI-generated code to production. Specifically, Addy mentions:
1) Never commit code you can’t explain (to others or yourself)
2) Human-in-the-loop remains critical - expect to manually change things
3) Reading code becomes a bigger part of the job - more time reviewing, testing, explaining than typing
4) Tests are your safety net - de-risk AI coding by catching issues early (see the sketch right after this list)
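On rule 4, here’s a minimal sketch of what that can look like in practice; the function and test below are hypothetical examples of mine, not something from Addy’s talk. The idea: pin the current behavior with a characterization test before handing the code to an agent, so a regression in the AI-generated change fails fast instead of surfacing in review.

```python
import pytest

def normalize_email(raw: str) -> str:
    # Existing code an agent is about to refactor (illustrative).
    return raw.strip().lower()

@pytest.mark.parametrize(
    ("raw", "expected"),
    [
        ("  Ada@Example.COM ", "ada@example.com"),
        ("grace@navy.mil", "grace@navy.mil"),
    ],
)
def test_normalize_email_keeps_current_behavior(raw, expected):
    # If the AI-generated change alters observable behavior, this fails immediately.
    assert normalize_email(raw) == expected
```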
Regarding human supervision, AI seems to need it more in some contexts than others. It seems less necessary for greenfield code, boilerplate generation, generic functions, unit tests/docs, prototypes/MVPs, and well-defined tasks with predictable patterns. On the other hand, AI struggles with high-context, novel problems: deep architectural decisions, informal business logic, security-sensitive areas, ambiguous requirements, and logic spanning multiple systems.
Regarding productivity, Addy presents positive data with clear gains in completed tasks and merged pull requests, but system-level bottlenecks also emerge: PR review time increases, PR size increases, AI makes changes to unnecessary files, AI suggestions waste time, and debugging AI output is costly in time and resources. As you may have noticed, the paradox is that we sped up code generation, but human verification doesn’t scale at the same rate.
Another problem that comes up in this talk is trust in the accuracy of AI-generated code.
The 70% problem is also present: AI gets you about 70% of the way toward solving a problem, but the last 30% is the hardest (edge cases, architecture, tests, cleanup...). This is especially critical for seniors, for whom it can mean a slowdown; in some cases they would probably have been faster writing the code from scratch. For juniors, that first 70% feels magical, but they may be pushing the other 30% to code review, where seniors have to clean up mistakes that arguably no human would have made.
Another topic Addy discussed was the impact on hiring and juniors: AI is commoditizing easy tasks, and entry-level roles may shrink or evolve to focus on what AI can’t do well. The flip side of this reduction in junior talent is that we still need to feed the talent pipeline so that today’s juniors can become tomorrow’s seniors.
Addy also gives some recommendations to mitigate skill erosion: 1) turn code reviews into learning moments (ask good questions that force others to explain the AI output, focus on comprehension not just correctness, turn them into mentorship sessions); 2) trio programming (keep an expert in the loop: senior + junior + AI); 3) regular no-AI challenges (designate days or tasks without AI to build resilience for when AI fails); 4) keep the human as the guarantor of quality standards, robustness, etc.
Voice dictation seems to be a new interaction style that’s gaining traction (Addy says it’s 3-5x faster than typing, though terrible if you’re in the office, of course). Lastly, Addy references Chrome DevTools MCP and gives the example of Shopify and its company-wide integration policy.
The final conclusions: keep the human in charge, letting creativity, judgment, and understanding guide AI use; and invest in tests and CI (before expecting big gains), clear docs, quality and validation processes, and mentorship/learning systems.
We need to talk about this "College is useless" take
This week’s issue from Josh Brake settles the debate. What the post says is that if you approach college with a transactional mindset, sure, it can look useless—but this tweet from Guillermo (whom I respect a lot professionally) misses something fundamental: students are shaped by their college experience in many ways (deep questions, discerning their place in the world, building skills…). Not only that: if anyone can do anything, that’s precisely why college is more valuable than ever. College doesn’t just teach you the what and the how—it’s about figuring out the why. And let’s be honest: while the barrier to building a web app on Vercel has gone to zero, let’s not pretend that the most challenging problems in the world can or will be solved with web apps.
Related to this topic, I really liked these words from David Escobar Arango (Comfama) on the afueradentro podcast about his life-changing experience doing his master’s at Harvard Kennedy School:
A space where I could write a lot, read a lot, connect with incredible people, and distill my ideals and dreams.
An AI engineer on what CS Education actually prepared him for
I really enjoyed this episode of the previously mentioned afueradentro podcast with Alexander (Johnny) Montoya, who is an AI Research Engineer at La Haus.
I learned a ton, and I think Johnny manages to make approachable topics that could be intimidating if you’re not in this world. Here are some ideas from the topics they covered:
Johnny believes that with AI we’re thinking at a higher level, thinking about what tasks we’re going to assign to it—it’s like being given a bigger team. This leads, he says, to developing other skills like communicating well, knowing how to orchestrate, delegate... He warns that this new workload is also mentally demanding; we feel more mental fatigue.
Johnny encourages understanding the technology, seeing the good and the bad, not just pigeonholing yourself into the negative and starting to criticize, but understanding its limits. What he says about his CS degree is interesting—that he learned CS is a tool that can be applied in any form in the real world—and he also highlights the value of learning in college.
How even something that seems as logical as AI and the search for knowledge ends up being an act of presence: being deeply connected with everything happening in this moment, with all the signals converging, until it’s impossible not to feel it and be part of it all.
Beyond the topics, I felt very connected with Johnny on a personal level. I’m also a software engineer and I’m someone who never stops learning, who’s always there taking the pulse of what’s happening in this computing education world, and who also complements this abstract and logical world with other more humanistic interests.
Also, from the host Jorge, I learned two criteria that I’m going to try to apply to my podcast (although I think I was already doing it unconsciously): one is that the guests know what they’re talking about, whether it’s an experience, something they lived through, a field of knowledge—that they’re an expert in something. And two, that they know how to communicate it, that they know how to express it, that they captivate you. I like that approach of looking for voices that make you say “it grabbed me, it caught me, it hooked me.”
In this conversation between two friends, I also felt how it flowed little by little. I’ll be following this podcast closely to find more people who share, with that same power, the things that interest and excite them.
🔍 Resources for Learning CS
→ My updated technical reading list
Let’s give books this Christmas. I’ve reorganized the sections of my technical book list with new subtopics to make them easier to find. Years of constant curation went into this. You can take a look here.
→ The power of side projects
Finding a job is tough right now, and building side projects that genuinely interest you seems more effective than sending a resume identical to everyone else’s. In this post, Abigail Haddad shows how building small, interesting things can beat traditional applications, especially in data roles—side projects that not only open doors but also improve real skills. Great advice.
→ Data Analysis in your browser
FlowSQL is an in-browser SQL environment for CSVs and SQLite files. Privacy-focused—no server, no login, data stays local. Includes query history, light/dark mode, and one-click exports. If you want to run R, use webr.sh instead.
→ AI Tools for technical interview prep
For those practicing for technical and behavioral interviews, or also interesting for faculty who are curious about the current AI tools companies use to interview CS students:
1) Cluely 2) LockedIn AI 3) Interview Sidekick 4) Final Round AI 5) Cursor AI IDE 6) Claude Code 7) CodeSignal Learn + AI Reviews 8) HackerRank + AI-Assisted Practice 9) AlgoCademy 10) LeetCode Premium + AI features
🔍 Resources for Teaching CS
→ Gen-AI and MLOps lectures from Northeastern
Ramin Madi teaches Gen-AI, MLOps, ML, and NLP at Northeastern, and his lectures bring real-world experience. If you're looking for practical ideas for Spring 2026, this is a great resource.
→ Teaching transformers in Spring?
This article offers a technical deep-dive into how transformer-based architectures are being deployed in production autonomous vehicles. A good example for courses on advanced DL (transformer applications), Software Architecture (modular vs. monolithic), or real-time systems. The article combines technical rigor with real-world engineering decisions.
→ A practical textbook on database design
Practical textbook that teaches students how to bridge the gap between business requirements and database implementation—formalizing requirements into structured logical models and systematically converting them into SQL tables. Perfect for helping students develop the critical thinking skills needed to translate real-world problems into robust database designs.
→ BDA and Geographic Data Science course materials
Aalto's Bayesian Data Analysis course (Aki Vehtari) delivers the full Gelman et al. workflow, with lectures, assignments, and code all available. For spatial computing, the University of Liverpool's Geographic Data Science course (Drs. Pietrostefani & Cabrera) bridges GIS and data science through practical R and Python labs covering vector/raster data, spatial weights, ESDA, clustering, and networks.
🦄 Quick bytes from the industry
→ Beyond the hype and Back to Research
I finally got to sit down and calmly watch this conversation between Ilya Sutskever and Dwarkesh Patel. Like the earlier interview with Karpathy, I think it’s a good moment to talk about AI’s limitations amid the hype. Specifically, he touches on topics like the disconnect between models doing so well on evals while economic impact lags so far behind, model brittleness, the problems with current RL approaches and model generalization, why it’s time for research again, and what research taste means to him.
From 2020 to 2025, it was the age of scaling... But now the scale is so big... is the belief really that, oh, it’s so big, but if you had 100x more, everything would be so different?... So it’s back to the age of research again, just with big computers.
Different people do it differently. But one thing that guides me personally is an aesthetic of how AI should be by thinking about how people are. But thinking correctly... You kind of ask yourself, is something fundamental or not fundamental? How do you think it should be?... Beauty, simplicity. Ugliness. There’s no room for ugliness.
→ Boris Cherny's Career Path
One of my recurring podcasts of the year. Another glorious episode.
The one technical book I would recommend to everyone that had the greatest impact on me as an engineer is called Functional Programming in Scala. The way it teaches you to think about coding problems is such a change from the way that most people code, either practically or in school. It’s going to completely change the way that you code.
→ The skills that matter now
In line with my previous editions on taste and computational judgment, Chris Messina argues in this long post that GenAI has turned code into an abundant commodity, which doesn’t eliminate devs but rather shifts value toward human skills: judgment, taste, and orchestration ability (I also talked about this in another edition). Chris proposes three archetypes for thriving in this new world: the Mixologist (combines components), the Record Producer (orchestrates diverse talent), and the Architect (designs coherent experiences)—roles where humanity and judgment matter more than writing code.
🌎 Computing Education Community
Madison Thomas, a PhD candidate at NC State, needs help with her dissertation survey on cybersecurity program retention at CAE-affiliated US schools.
Microsoft Research NYC’s Computational Social Science group is hiring interns. Their work focuses on AI-based systems and their impact from individual (AI-augmented cognition/decision-making) to societal scales (agentic markets and information ecosystems). Application deadline: January 9, 2026.
Exciting opportunities to work on a new Discovery Project: Enhancing learner feedback literacy using AI-powered feedback analytics!
Fully-funded PhD position at UNSW Sydney in programming languages and formal analysis. Research may include developing new methods (type theories, program logics) to reason about security, privacy, verified compilation, or complexity analysis in functional, imperative, or probabilistic programs, including mechanization. The position needs to be filled ASAP. Interested candidates should contact the professor before formally applying.
Research study on what and how to teach about AI in the UK and Ireland (not about AI tools for teacher productivity). Seeking interviews with: undergraduate students who recently completed A-Level Computer Science, current 16-19 year-old students in computing courses, university professors teaching AI, AI industry professionals, and computer science teachers for ages 14-19.
Brandeis University’s Computer Science Department is hiring a full-time teaching faculty member (Lecturer or Assistant Professor level) starting Fall 2026. The position is non-tenure track with a salary range of $85,000-$115,000. They’re seeking educators to teach core CS courses plus upper-level/graduate electives, with particular interest in AI/ML expertise (deep learning, reinforcement learning, data-driven systems, and ethical/societal aspects of AI).
University of Nebraska is hiring an Assistant/Associate Professor in Software Engineering or Data Science.
Assistant professor position open in computer science and data analytics at Heidelberg University (Ohio) serving primarily first-generation college students.
Aalto University’s Department of Computer Science is hiring a tenure-track Assistant Professor in computer science. The search is broad and considers all CS areas, with particular interest in datacenter-scale computing systems including HPC, quantum computing, accelerator platforms, parallel/distributed programming, software design and optimization, energy-efficient computing, and HPC-AI integration. Application deadline: February 2, 2026.
🤔 Thought(s) For You to Ponder…
I loved this piece from Matheus Lima's blog on what actually makes you senior, and it connects beautifully with Eric Normand's essay on denotational vs operational thinking. Lima argues that senior engineers distinguish themselves not through checklists of skills but by their ability to reduce ambiguity through asking good questions that clarify problems, challenge assumptions, and create confidence before execution—rather than waiting for clarity or jumping straight into coding. Normand makes a strikingly parallel point: a key skill differentiating seniors from juniors is “the ability to separate implementation from specification”—juniors focus on getting something working (operational thinking), while seniors think about the meanings they want to represent apart from how to represent them (denotational thinking).
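To make that distinction concrete, here’s a toy sketch of my own (not taken from either essay): the specification of “sorted” written down on its own, separate from any particular procedure that happens to satisfy it.

```python
from collections import Counter

def satisfies_sort_spec(xs: list[int], ys: list[int]) -> bool:
    """Denotational view: ys is a valid sort of xs iff it is an ordered permutation of xs."""
    same_elements = Counter(xs) == Counter(ys)
    ordered = all(a <= b for a, b in zip(ys, ys[1:]))
    return same_elements and ordered

def insertion_sort(xs: list[int]) -> list[int]:
    """Operational view: one of many procedures that happens to meet the spec."""
    result: list[int] = []
    for x in xs:
        i = 0
        while i < len(result) and result[i] <= x:
            i += 1
        result.insert(i, x)
    return result

# The spec doesn't care which algorithm produced the result.
assert satisfies_sort_spec([3, 1, 2], insertion_sort([3, 1, 2]))
```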
BTW, I find everything Eric Normand writes fascinating. Subscribe to his Substack here:
Interesting article by Paul Blaschko on workism: we sense that workism is wrong, but at the same time, it's hard to articulate why it's wrong when it generates genuinely good things. Besides, work isn't like eating a brownie—it's not just pleasurable, it also requires skills, virtues, and generates good friendships... One thing seems clear after reading the article: making work your primary source of meaning and identity doesn't mean it has to be the exclusive one.
In my case, I write by hand and use an LLM to check my work before it goes through human editing. I’m clear that I shouldn’t give up what I want and love to do: write and think about what interests and excites me. This piece in Spanish strikes me as very timely. It shows in detail how many students across different majors have already given up and handed their creative space over to the prompt.
There’s a clear risk in leaving all initiative to AI: you lose control, and the creativity that makes us human gets replaced. We need a personal strategy for using artificial intelligence that keeps us in control. The cost of not doing so, as this piece shows, is very high.
The wow factor, based on this handbook from MrBeast, is something that transforms what you’re watching into content you won’t easily forget. Do something that connects with who we are, what we want, and what we dream about—in other words, something that speaks to us. That ability to create content that connects, I think, is the key that makes us special compared to AI-generated content.
I came across this 2019 article in which Professor Ricardo Baeza talks about algorithmic biases and how acknowledging their existence is part of understanding how technology works. Without going into the details of each bias, the solution he proposes involves better problem definition, improved data, better attribute selection, and giving users more control. Biases are part of our nature; it's not possible to eliminate them completely, but it is possible to apply these solutions to reduce them and not perpetuate injustices at an algorithmic scale.
A good point from Tigran Sloyan (CodeSignal's CEO) that came in handy this week for reflecting on the state of technical interviewing in a GenAI world was:
“Just because AI can do it doesn’t mean you understand what it did, right? So, I’d actually like to understand, do you have the skills to know what’s going on by showing me that you can do it yourself... Similarly to how, like, just because calculators can do multiplication and division and everything, we still teach kids how to do multiplication and division. Because just because a calculator can do it for you doesn’t mean you don’t need to know how to do it yourself.”
I wrote here a few weeks ago about how we can’t lose sight of the fundamentals.
I discovered this video in the CIC newsletter.
📌 The PhD Student Is In
→ Grades are In
One of the requirements for continuing my doctoral contract is to maintain a cumulative GPA of 3.0 or higher. This week we received our grades here at UH, and for now, I can continue for another semester. I got an “A” in Cloud Computing, a “B” in AI, and an “S” in Doctoral Research. A “B” in AI may seem like a low grade, but believe me, this course cost me blood, sweat, and tears haha
🪁 Leisure Line
We just put up the Christmas tree and nativity scene. Definitely brings a little more joy and warmth to our home for the holidays (there are still a few little touches we’ll add over the next few days).


Christmas catch-up with my friend Trent from Rice! Keep up the great work in spring with the Thunder and with Rice Owls.
📖📺🍿 Currently Reading, Watching, Listening
I got Voices of the Saints after seeing it on Tsh Oxenreider’s 2025 Gift Guide. Looking forward to reading the 365 different saints’ stories, some of whom are household names but many of whom are less well-known. One thing I love about the book is that each entry includes an excerpt from the saint’s own writing or another relevant source, so I can receive a different saint’s wisdom, counsel, and encouragement each day in 2026. December brings gifts. Books are a great gift, and the guide above lists books that you might find interesting.
Have you heard about Telo trucks? They are cool electric trucks and they're ready for pre-orders.
Dallas is about four hours from Houston, so it’s not a bad plan to visit its new Netflix House during the Christmas break. Can you imagine entering an interactive world where series come to life? Doesn’t that sound cool?
Today I’d like to recommend Jon Batiste’s latest album. It’s called The New Americana Collection. It grabbed me from the first song. BIG MONEY is the single that brought me to it:
💬 Quotable
What really changed was how I made coding not only a thing to build new products, but the way I use my computer. Now I have servers providing information and doing stuff for me. My whole relationship with technology changed because of AI. It’s fascinating that a lot of people still don’t see this change coming, and it’s definitely redesigning how they work and how technology is getting into their offices and projects.
― Christian Van Der Henst, Managing Partner at Region Cuatro, Angel Investor, Co-founder of Platzi
As always, if you enjoy Computing Education Things, please like, comment, or share this post! You can also support this work through Buy Me A Coffee.