
The state of AI investing: Jan 2026 edition

My survey of the current state of AI investing, as of January 2026.

Overview

First, I will break down the broad themes and bottlenecks in play. Then I will give you some vocab to learn. After that, I will talk about various companies, including the current state of the big AI labs. Lastly, I’ll talk about stocks.

Disclosure: Not investment advice. I may own some of the tickers mentioned, I call them out at the end. Items marked with * are reported/rumored/estimated (not confirmed).

Bottlenecks approach

A bottleneck is the limiting factor in a system. Fixing a bottleneck allows more throughput, but remember: a new bottleneck always appears somewhere else.
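
To make that concrete, here is a toy Python sketch of the "fix one constraint, reveal the next" dynamic. The stage names and capacities are made-up illustrative numbers, not real supply-chain figures:

```python
# Illustrative only: a pipeline's throughput is capped by its slowest stage.
stages = {
    "wafer_fab": 120,          # units/week each stage could do in isolation
    "advanced_packaging": 80,  # current bottleneck
    "hbm_supply": 95,
    "final_test": 150,
}

def bottleneck(stages: dict[str, int]) -> tuple[str, int]:
    """Return the limiting stage and the whole system's throughput."""
    name = min(stages, key=stages.get)
    return name, stages[name]

print(bottleneck(stages))           # ('advanced_packaging', 80)

stages["advanced_packaging"] = 200  # capacity added (e.g., new packaging lines)
print(bottleneck(stages))           # ('hbm_supply', 95) - a new bottleneck appears
```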

There is a race on to build AGI (artificial general intelligence) & ASI (artificial super intelligence) because it’s believed that whoever obtains it first will be able to command a lead that exponentially grows. See the background reading you should do at the bottom of this post.

If we are really shooting for AGI & ASI, there will be many cycles and phases. One bottleneck will be solved as new chip fabs are built, and then something else will become the constraint. Along the way, algorithmic advances and demand from robotics will keep enabling new AI applications.

Current bottlenecks

  1. Semicap: I have read a couple of analyses that make sense to me arguing that “semicap” (semiconductor capital equipment) will be a bottleneck for the next 2 years or so, mostly from Citrini & Irrational Analysis. These are the machines installed in chip fabs that actually make the chips. This looks like the next pressing bottleneck.
  2. Memory: HBM, NAND, hard drives - all of it is the latest bottleneck area. Every memory-related stock has already doubled, tripled, or quadrupled since the summer. But the HBM shortage may not end until 2027, so the tailwind behind these stocks could run for a while yet.

Other themes

Silicon photonics

Silicon photonics - i.e. using light with silicon chips. This is another hot theme with several applications. One is higher-speed data transfer (optical links carry far more bandwidth with less loss than electrons through copper). Another is doing matrix multiplication optically, in what are called photonic accelerators or optical neural networks, built from a silicon photonics interferometer mesh. The idea is that you could run some matmul inference operations with very little heat, because the computation happens in light. As far as I can tell, nobody has photonic compute in production yet, so the main photonics applications today are switching technologies and data transfer.
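
To make the matmul-with-light idea concrete: a mesh of interferometers can implement any unitary matrix, and an arbitrary weight matrix factors (via SVD) into unitary × diagonal × unitary - i.e. mesh → per-channel attenuators → mesh. Here is a toy numpy sketch of that decomposition; it is just the math, not any vendor's hardware:

```python
import numpy as np

# Sketch: photonic meshes implement unitaries; an arbitrary weight matrix M
# factors as M = U @ diag(s) @ Vh, i.e. mesh -> attenuators -> mesh.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))  # the weight matrix we want to apply optically
x = rng.standard_normal(4)       # input vector (encoded as light amplitudes)

U, s, Vh = np.linalg.svd(M)      # M = U @ diag(s) @ Vh

y = Vh @ x   # stage 1: first interferometer mesh applies the unitary Vh
y = s * y    # stage 2: per-channel attenuators apply the singular values
y = U @ y    # stage 3: second mesh applies the unitary U

assert np.allclose(y, M @ x)     # same result as an electronic matmul
```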

Power

Another theme is power: the US grid is old and complex to deal with, thanks to regulation and bureaucracy as well as the long lead times to actually build new power plants. Nuclear is a long-term angle, and the price of uranium has risen a lot in recent years, but nuclear takes a long time to permit and build. The main US source is natural gas: GE Vernova makes high-efficiency gas turbines (https://www.gevernova.com/gas-power) that convert natural gas to electricity at roughly 60% efficiency. This is how most new power plants in the USA work today: they replaced coal and run cheaper at higher efficiency. The USA has a huge domestic supply of natural gas (commonly estimated at a century or more of consumption), and prices should remain low thanks to advances in fracking. The second-largest new source of power is solar.

However, the power situation has been obvious for more than a year, so all of these stocks have already been bid up. What is new is that the AI datacenter builders are giving up on the grid and literally building their own on-site power plants. Since GE Vernova & Siemens have backlogs measured in years, anyone who can plausibly build a turbine is now building one. Solar is another solution, but it requires massive battery systems (which Tesla & others are providing). Another option that can be delivered now is fuel cells running on methane (natural gas) - that's the Bloom Energy story.
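
Some back-of-envelope numbers on why on-site generation is such a scramble. All inputs here are round assumptions of mine (IT load, PUE, turbine block size), not vendor specs; only the ~60% efficiency figure comes from the discussion above:

```python
# Rough sizing sketch for a hypothetical gas-powered AI campus.
it_load_mw = 1000        # assumed 1 GW of IT load for a large training campus
pue = 1.3                # assumed power usage effectiveness (cooling, losses)
turbine_output_mw = 500  # assumed output of one large combined-cycle block

total_demand_mw = it_load_mw * pue
blocks_needed = -(-total_demand_mw // turbine_output_mw)  # ceiling division
print(f"Site demand: {total_demand_mw:.0f} MW -> {blocks_needed:.0f} turbine blocks")

# Fuel side: at ~60% conversion efficiency, every MWh of electricity
# needs ~1.67 MWh of natural gas energy input.
efficiency = 0.60
gas_input_mwh_per_year = total_demand_mw * 24 * 365 / efficiency
print(f"Annual gas energy input: {gas_input_mwh_per_year / 1e6:.1f} TWh")
```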

Neoclouds

Neoclouds - this one is not new, but it’s hated by many. These are smaller players (not Google, Amazon, Meta, and Microsoft) building datacenters for AI chips. The big neoclouds are CoreWeave, Nebius, and IREN (I say “big”, but they are 1/100th the size of the hyperscalers; they are underdogs). The criticism is that the chips depreciate quickly and all of these companies will go bankrupt. Maybe, but maybe not: demand for AI datacenters seems very high, and even “old” chips are still being used. I still personally think they represent some of the highest-risk, highest-upside bets in the market.

AI second-order effect capabilities

The second-order effects of the new AI capabilities are hard to model, but in general, there are a lot of new things that are possible way cheaper than before, and some impossible things are now possible.

Robotics

Many robotics applications are powered by AI already - one example of this is robots using machine vision for pick-and-place processes in warehouses. The new vision capabilities alone should yield an explosion of robotics applications. This stuff is all a little early, but there are many companies racing to develop humanoid robots that can do many common gross and medium-fine motor skill operations in homes, warehouses, kitchens, and factories.

Physics, chemistry, and biochemistry

AlphaFold is an AI system from Google DeepMind that can predict 3D protein structures about as well as experimental methods can. Demis Hassabis (CEO of Google DeepMind) and John Jumper shared one half of the 2024 Nobel Prize in Chemistry for it. Beyond this, many other simulations should become possible to build from larger sets of training data. There are also a couple of companies aiming to scale up real wet-lab chemistry, where AI-designed experiments are carried out by robots in massive parallel. These are basically all private today and very speculative; some, for example, aim to use AI to search for new superconductors.

AI datacenters in space

This is an interesting angle, but it’s not going to happen quickly. The idea: launch the chips into space and run them on solar power in sun-synchronous orbit, where the panels always face the sun and generate power constantly at high efficiency, and you dissipate the heat off the backside of the panels with radiators. For AI training, latency is not an issue. This idea 100% hinges on SpaceX Starship being successful and getting launch costs below $200/kg, which is actually possible if Starship is as reusable as they believe it can be. It’s attractive because it solves the regulatory and power problems on the ground, but it introduces a ton of new engineering challenges. Starcloud has a white paper you can read if interested: https://www.starcloud.com/
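
Rough arithmetic on why that launch-cost target matters so much. The rack mass and cluster size below are my own illustrative assumptions, not Starcloud's figures; only the $200/kg target comes from the paragraph above:

```python
# Illustrative launch-cost arithmetic for a small orbital GPU cluster.
launch_cost_per_kg = 200  # the Starship cost target mentioned above ($/kg)
rack_mass_kg = 1500       # assumed: one GPU rack + its share of radiators/solar
racks = 100               # assumed small orbital cluster

launch_cost = launch_cost_per_kg * rack_mass_kg * racks
print(f"Launch cost for {racks} racks: ${launch_cost / 1e6:.0f}M")  # ~$30M

# At today's launch pricing (a few thousand $/kg on Falcon-class rockets),
# the same payload would cost roughly 10x+ more - hence "hinges on Starship".
```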

Cybersecurity

A lot of attacks are possible to automate with AI at massive scale, and therefore more powerful defenses will be needed to fend off AI-powered exploits. There will probably be some cybersecurity companies that benefit from this.

Important vocab to memorize

  • AI training: the process of training a new model to predict outcomes.
  • AI inference: the process of using a trained model to predict outcomes from a given input. (This is what you do when you talk to ChatGPT - it’s running “inference”, i.e. predicting what the output should be.)
  • Matrix multiplication AKA matmul: it’s a simple mathematical operation. Current AI neural networks work by doing many trillions of these in parallel at unimaginable speed and scale.
  • GPU: graphics processing unit. GPUs have many small, individually programmable cores, which makes them good at matrix multiplication while staying flexible enough to do a lot more than just simple matmul.
  • ASIC: application-specific integrated circuit. If a CPU can do any computation, an ASIC can only do a specific computation but MUCH more efficiently and faster, since it’s etched permanently in silicon. However they are very inflexible and useless if you change what you are doing too much. E.g. they are great if you make one for mining Bitcoin since the SHA-256 hash algorithm never changes. Riskier for AI since we change how we do stuff frequently as new algorithmic advances are discovered.
  • TPU: tensor processing unit - a sort of middle ground between GPUs and ASICs. Google custom-makes these, contracted directly through Broadcom + TSMC. Unlike almost everyone else, Google doesn’t rely heavily on NVIDIA chips.
  • HBM: high bandwidth memory - it’s a special kind of super-fast RAM that GPUs and TPUs use.
  • NAND: permanent storage but done in silicon. Think flash drives, SSD storage etc.
  • LPU: a special kind of matmul accelerator designed specifically for AI inference. It uses SRAM instead of HBM, which requires many chips running in parallel with the model sharded across them, and it can achieve higher throughput. NVIDIA just bought Groq, the company that invented this technology.
  • Semicap: semiconductor capital equipment.
  • WFE: wafer fab equipment.
  • Parametric yield: what % of chips meet specified performance targets. Different from “Does it work” yield.
  • PPA: power, performance, area. 3 metrics that matter for process nodes.
  • Process node: a specific recipe for manufacturing chips. Steps + material + equipment settings + design rules.
  • “nm naming”: used to refer to the actual feature size of the process in nanometers. The correlation broke down around 28nm; now the numbers are marketing. TSMC’s 3nm process does not have 3nm features - the name is simply what came next.
  • CoWoS: chip on wafer on substrate. It’s TSMC’s advanced packaging tech that lets you put multiple chips (chiplets) side by side on a silicon interposer and connect them with high bandwidth.
  • Silicon interposer: thin slab of silicon that acts as a wiring layer between chips. Facilitates communication between HBM and chiplets on a single GPU die. Traditionally this would have been done with literal wires, but now we hit a density and bandwidth requirement that means it’s done inside of silicon.
  • VRMs: voltage regulator module. Power delivery for chips.
  • Metrology: measurement - literally just measuring stuff during manufacturing but they are doing it at nanometer scale so it’s very hard.
  • OSAT: outsourced semiconductor assembly and test. They take the finished wafers from TSMC, Samsung, and Intel, and they do the advanced packaging + testing services.
  • Timing chips: every electronic system needs a clock signal to sync operations. Traditionally this was a quartz crystal.
  • CPO: co-packaged optics - integrating the optical components into the same package as the switch silicon, instead of using pluggable optical modules.
  • OCS: optical circuit switch - switching light paths directly (e.g., with tiny mirrors) without converting back to electrical signals.
  • VLA: vision-language-action, a new approach to controlling robots with AI models.
  • COT: chain of thought, as in how the new reasoning models work. Chain-of-thought prompting existed well before then, but “reasoning models” (as a mainstream product category) really took off when OpenAI released o1-preview in 2024. COT uses many more tokens than instant-response models.
  • Tool calling: the ability of a language model to know about a tool, call to use it, and use the response and move to another action. This was largely made possible by function-calling interfaces.
  • AI agent: a long-running AI task that acts as an agent to achieve a goal. Today these are powered by COT reasoning models, which may make multiple tool calls and reason about the responses. Some agents can now work on a single task uninterrupted for hours; personally, the longest I have seen a coding agent run on one task is about 30 minutes straight, last month.
  • MOE: mixture of experts - an architecture where, instead of running every input through the entire neural network, you have multiple specialized “expert” sub-networks and a “router” that decides which experts to use for each input. A lot of the new larger models use this approach for cost reduction (see the sketch after this list).
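
Here is the MOE sketch referenced above: a minimal toy version of the routing idea in Python. It is illustrative only - not any particular lab's architecture - and all shapes, sizes, and names are made up:

```python
import numpy as np

# Toy mixture-of-experts layer: a router picks top-k experts per token,
# so only a fraction of the total weights actually run.
rng = np.random.default_rng(0)
d_model, n_experts, top_k = 8, 4, 2

experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
router_w = rng.standard_normal((d_model, n_experts))

def moe_layer(x: np.ndarray) -> np.ndarray:
    logits = x @ router_w                     # router scores, one per expert
    probs = np.exp(logits) / np.exp(logits).sum()
    chosen = np.argsort(probs)[-top_k:]       # top-k experts for this token
    weights = probs[chosen] / probs[chosen].sum()
    # Only the chosen experts run - this is where the compute saving comes from.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))

token = rng.standard_normal(d_model)
print(moe_layer(token).shape)  # (8,)
```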

Brief overview of semiconductor manufacturing process steps

  1. Design - companies like NVDA and AMD design the chips using EDA software from Cadence and Synopsys.
  2. Wafer fabrication - TSMC does 1000+ steps to deposit material on silicon wafers, pattern it with lithography (e.g., ASML scanners), and etch it with dedicated etch tools.
  3. Wafer probe + testing - before they cut the full wafer up (typically a 300mm / 12-inch wafer), they test the dies while still on the wafer. Probe cards touch the test pads. Bad dies get marked, and yield is recorded at this step (see the yield sketch after this list).
  4. Dicing - they cut up the wafer into dies. Good dies move on.
  5. Advanced Packaging - They fab the interposer, they add HBM stacks. Bond using micro bumps (tiny solder balls that number in tens of thousands).
  6. Final test - they test the final package with something called “burn-in” where they run at elevated temp + voltage to stress them. This is so failures happen here, not with customers. They functional test all features, and then they speed-bin - the chips are sorted based on how fast they actually run on real test results. (Multiple SKUs can share a die design but be binned differently at this stage.)
  7. Module assembly - chip is mounted to final module board, power VRMs added, connectors attached. Ships to customer after power testing.
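
Here is the yield sketch referenced in step 3 - toy math using standard textbook approximations. The die size and defect density are assumed round numbers, not any foundry's actuals; the point is that big AI dies near the reticle limit yield few good chips per wafer:

```python
import math

wafer_diameter_mm = 300       # the 12-inch wafers mentioned in step 3
die_area_mm2 = 800            # assumed: near-reticle-limit AI die
defect_density_per_cm2 = 0.1  # assumed defect density D0

# Gross dies per wafer (common approximation that accounts for edge loss):
d, s = wafer_diameter_mm, die_area_mm2
gross_dies = math.pi * d**2 / (4 * s) - math.pi * d / math.sqrt(2 * s)

# Poisson yield model: Y = exp(-A * D0), with A converted from mm^2 to cm^2.
yield_frac = math.exp(-(die_area_mm2 / 100) * defect_density_per_cm2)

print(f"Gross dies/wafer: {gross_dies:.0f}")   # ~65
print(f"Functional yield: {yield_frac:.0%}")   # ~45%
print(f"Good dies/wafer:  {gross_dies * yield_frac:.0f}")
```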

The big labs doing SOTA (state of the art) AI models

  • OpenAI - You know them: they make ChatGPT! ChatGPT has 800 million weekly active users*, making it one of the largest apps on the planet. They are rumored to go public this year or next.* The closest public proxies are CoreWeave (which they have deals with via Microsoft), Oracle (heavily involved in Stargate), and AMD (they have an interesting deal with AMD* that would substantially impact the stock price). OpenAI is deeply tied to Microsoft, which owns roughly 30% of OpenAI.* Their valuation is reportedly around $800 billion.*
    • OpenAI uses Microsoft and Oracle datacenters as well as CoreWeave.
    • Currently GPT 5.2 PRO leads on benchmarks for absolute problem-solving capabilities and long-running task ability. OpenAI pioneered the COT reasoning paradigm that is SOTA right now with the most powerful models.
  • Anthropic - they make Claude and they are doing very well. Their valuation is reportedly around $400 billion.* They are said to be going public this year.* They are deeply tied to Amazon, though we don’t know how much of Anthropic Amazon owns. They also have some deals with Google. They are taking a more B2B / enterprise approach compared to OpenAI.
    • Anthropic is known to use Amazon datacenters & Amazon’s custom Trainium chips.
    • Anthropic is interesting because they focused solely on coding, research and chat. They don’t even have a SOTA model that can generate images, audio or video like the other big labs do.
  • Google - Google scientists invented the transformer architecture that made the current AI race possible, but they were behind for all of 2024. They recovered and caught up in 2025. But Google seems to have issues with consumer products, and Anthropic is winning over programmers now. So what will Google do? They are very good at AI and are basically the assumed eventual victor because of their incredible resources and deep experience with AI.
    • Google is unique in that they have their own datacenters, their own custom chips TPUs, and incredible resources (both monetary and human). Again, they are the assumed eventual victor and NOT the underdog.
    • Google is in a weird situation: using AI is often better than Google Search, and Google Search volume is starting to decline rapidly.* But they also have SOTA AI, and Gemini is popular; however, it makes no money. They will probably be the first to add advertisements to AI interfaces.
  • Meta - the parent company of Facebook and Instagram. They had a lot of upheaval because Llama 4 was an embarrassment and failed to maintain SOTA on benchmarks. Zuckerberg basically fired everyone and hired a whole new AI team. We don’t really know what they are doing and their plans are unclear, but they have a ton of money and datacenters. It’s assumed they are in the race for SOTA models. They need AI internally to make ads and the feeds work, and all consumer AI features are useful to them.
  • xAI - the most interesting situation is xAI - they now own X (formerly Twitter) and Elon Musk is CEO. Musk actually co-founded OpenAI and donated the first ~$100 million* to start the company. Somehow OpenAI became a for-profit company, and Musk is very salty about it and hates Sam Altman and OpenAI. xAI looks like a spite company at first glance, but I think Musk is so ASI-pilled he cannot help but try to build ASI himself.
    • xAI is legitimately making SOTA models. They did build a massive datacenter very quickly and Musk did hire top-tier AI researchers. They have good models for language, vision, images, audio and video, which is wild because they started later than everyone else.
    • The question is longevity - how will they make enough money to stay in the race? Google, Microsoft, Amazon, and Meta have basically infinite money compared to xAI, which has to raise it from investors. The non-big-tech labs are closely tied to rich benefactors (OpenAI → Microsoft, Anthropic → Amazon), but xAI does not have this. X (Twitter) is not very profitable and carries a lot of debt.

Private companies applying new AI capabilities

These are some of the most interesting things going on potentially because they are trying to push the boundaries of the new AI models and new concepts into different or non-obvious fields.

None of these are investable unless you are a big VC firm, but you should know they exist.

I will highlight a couple: Project Prometheus, which is a secretive new Bezos company that doesn’t even have a website yet. Dynatomics is a secretive Larry Page (Google co-founder) project for AI-based “product manufacturing”. Periodic Labs is stacked with top-tier talent and is trying to do material discovery with AI. Physical Intelligence is also a talent-stacked company building new foundation models to power robots. There are also many AI-assisted drug discovery companies in the list below.

| Name | Website | Who? | What | Funding |
| --- | --- | --- | --- | --- |
| General Intuition | generalintuition.com | | world models | hundreds of millions |
| Project Prometheus | (stealth) | Jeff Bezos | manufacturing | (stealth) |
| Harmonic AI | harmonic.fun | Vlad Tenev | math | hundreds of millions |
| Dynatomics | (stealth) | Larry Page | product manufacturing | (stealth) |
| Physical Intelligence | physicalintelligence.company | former DeepMind people | robots & world models | $600 million |
| Skild AI | skild.ai | former Meta | robots | $300 million Series A |
| World Labs | worldlabs.ai | Fei-Fei Li | world models | $230 million |
| AMI Labs | amilabs.io | Yann LeCun | world models | $500 million |
| Safe Superintelligence | ssi.inc | Ilya Sutskever | unknown | $5 billion |
| Thinking Machines Lab | thinkingmachines.ai | Mira Murati | unknown | $2 billion |
| Figure AI | figure.ai | Brett Adcock | robots | $5 billion |
| Tenstorrent | tenstorrent.com | Jim Keller (legendary chip architect - Apple, AMD, Tesla, Intel) | RISC-V chips | $693M Series D |
| Lightmatter | lightmatter.co | Nick Harris (CEO), Darius Bunandar, Thomas Graham | photonic computing - “Passage” photonic engine enables 1,024 GPUs working synchronously; CEO says “probably our last private funding round” - IPO likely | $400M |
| Liquid AI | liquid.ai | Ramin Hasani (CEO), Mathias Lechner, Alexander Amini, Daniela Rus (MIT robotics luminary) | alternative to the transformer architecture - smaller memory footprint, runs on-device, needs less compute; “Liquid Foundation Models” (LFMs) for edge deployment | $293M total |
| Xaira Therapeutics | xaira.com | Marc Tessier-Lavigne (former Stanford president, former Genentech CSO); co-founder David Baker (UW Institute for Protein Design, 2024 Nobel Prize) | AI-powered protein/antibody design using RFdiffusion and RFantibody models | $1 billion Series A (2024) |
| EvolutionaryScale | evolutionaryscale.ai | Alexander Rives (led Meta’s AI protein team before 2023 layoffs) | ESM3 model - “simulate 500 million years of evolution” - generated a novel green fluorescent protein | $142 million seed (June 2024) |
| Shield AI | shield.ai | Brandon Tseng (former Navy SEAL), Ryan Tseng (former Qualcomm engineer) | “Hivemind” AI pilot software for autonomous drones/aircraft (works without GPS/comms) | $1.3 billion |
| Isomorphic Labs | isomorphiclabs.com | Google DeepMind spinout | AI drug design using AlphaFold technology for protein structure prediction | $600M Series A (2025) |
| Chai Discovery | chaidiscovery.com | Josh Meier (former OpenAI, Meta FAIR) | foundation models for drug discovery - predicting/reprogramming molecular interactions; Chai-2 achieved a 16% success rate in de novo antibody design (100x+ improvement over prior methods) | $200M total |
| Lila Sciences | lila.ai | Geoffrey von Maltzahn (Flagship GP, co-founded Generate Biomedicines); chief scientist George Church (renowned Harvard geneticist) | “scientific superintelligence” - AI platform and fully autonomous labs for life/chemical/materials sciences | $435 million |
| Periodic Labs | periodic.com | Liam Fedus (former OpenAI VP Research, key ChatGPT architect), Ekin Dogus Cubuk (former Google DeepMind research scientist) | AI materials science - one of the largest seed rounds ever in materials science; appears focused on superconductor discovery | $200 million |
| Profluent Bio | profluent.bio | Ali Madani (CEO) | frontier AI models for protein design - first CRISPR system created from scratch with AI | $150 million |
| Generate Biomedicines | generatebiomedicines.com | Mike Nally (CEO) | generative biology - AI-designed protein therapeutics | $693 million total |
| Insilico Medicine | insilico.com | Alex Zhavoronkov (founder) | first fully AI-discovered drug to enter Phase 2 trials (Rentosertib, for idiopathic pulmonary fibrosis) | $500 million+ |
| Cellares | cellares.com | Fabian Gerlinghaus (founder, CEO) | automated cell-therapy manufacturing (robotic factories for biology) | $255M Series C (Jul 2023) (BioPharma Dive) |

Humanoid robots

This deserves its own section, because if it works there are endless applications for labor automation. To be extremely clear: there are a ton of engineering challenges around even making these work right now. We are a few years away from mass deployment, and a lot of what you see today is teleoperated demos (humans controlling the robots remotely).

Teleoperated humanoid robots are actually better than they sound. There are many applications where teleoperation still makes labor cheaper, because the operator can sit in a lower cost-of-living country, or one operator can supervise 2, 3, or more robots simultaneously, only jumping in for interventions. It’s a stepping stone to fully autonomous embodied AI.
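
Toy math on why this can pencil out. The wages and costs below are my own assumptions for illustration, not any vendor's actual pricing:

```python
# Illustrative teleoperation economics, per "unit of labor" per hour.
local_wage_per_hr = 25.0   # assumed on-site warehouse wage
remote_wage_per_hr = 6.0   # assumed operator wage in a lower-COL country
robots_per_operator = 3    # one operator supervising several robots
robot_cost_per_hr = 4.0    # assumed amortized hardware + maintenance per robot

human_cost = local_wage_per_hr
teleop_cost = remote_wage_per_hr / robots_per_operator + robot_cost_per_hr
print(f"Human:  ${human_cost:.2f}/hr")
print(f"Teleop: ${teleop_cost:.2f}/hr per robot")  # $6.00/hr in this toy example
```

Under these made-up numbers, the robot only has to be a quarter as productive as a human to break even, and the economics improve as one operator handles more robots.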

There are a couple of smaller interesting companies not on this list. One is Sunday Robotics, which has a unique low-cost design for the home. Another is the NEO from 1X Technologies, which is taking preorders now and is meant to be a home assistant (they say they will deliver units this year!).

It’s definitely possible these will ship this year… but will they take 12 hours to do the dishes, and will they break half of your glasses in the process? I am very skeptical of the motor control abilities and speed today.

| Rank | Company | Country | Website | Valuation/Market Cap* | Units Deployed* | Est. Mass Production Date | Notes |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | Figure AI | 🇺🇸 US | figure.ai | $39B (private) | ~50-100 (BMW Spartanburg) | Now (2025) | Figure 02 fleet working full shifts at BMW; Figure 03 for home unveiled |
| 2 | Tesla (Optimus) | 🇺🇸 US | tesla.com | ~$1.2T (TSLA) | ~5,000-10,000 internal | Late 2026 | Targeting 1M/yr capacity by end 2026; Fremont pilot line active; consumer version $20-30K by 2027 |
| 3 | Unitree Robotics | 🇨🇳 China | unitree.com | $7B (targeted IPO) | 1,000+ (G1/H1) | Now (2024) | Claims profitable since 2020; G1 at $16K; IPO tutoring completed |
| 4 | UBTECH Robotics | 🇨🇳 China | ubtrobot.com | ~$6.8B (HK:9880) | 500+ delivered (2025) | Now (2025) | Walker S2 with auto battery swap; 1.3B yuan in orders; 300/month capacity |
| 5 | AgiBot (Zhiyuan) | 🇨🇳 China | agibot.com | $6.4B (private) | 5,000+ (as of Dec 2025) | Now (2025) | Ex-Huawei founder; A2 model; HK IPO targeting 2026 |
| 6 | Rainbow Robotics | 🇰🇷 Korea | rainbow-robotics.com | $5.8B (KOSDAQ:277810) | Pilots only | 2026 | Samsung strategic partner; HUBO heritage |
| 7 | Agility Robotics | 🇺🇸 US | agilityrobotics.com | $1.75B (private) | ~100 (Amazon, GXO, Spanx) | Now (2024) | Digit moved 100K+ totes; RoboFab targeting 10K/yr; first RaaS humanoid deal |
| 8 | Boston Dynamics | 🇺🇸 US | bostondynamics.com | Hyundai subsidiary | R&D/demos only | 2026-2027 | Electric Atlas (2024); iconic but no commercial humanoid deployments yet |
| 9 | Apptronik | 🇺🇸 US | apptronik.com | ~$350M+ raised | Pilots (Mercedes, Walmart) | 2026 | Apollo robot; Google DeepMind partnership; CES 2025 demos |
| 10 | Fourier Intelligence | 🇨🇳 China | fftai.com | ~$500M+ (SoftBank-backed) | 100+ (GR-1/GR-2) | Now (2025) | Rehabilitation robotics origin; $17M China Mobile contract with Unitree |

Stocks

Semicap (smaller names)

I’m only focused on the smaller companies, because those are the stocks with the potential to run hard and where valuations may still be lower. The biggest names (ASML, KLAC, AMAT, etc.) are obvious to everyone; some of them also have around 50% China revenue exposure, and they are very large caps already trading at high prices.

  • CAMT - Camtek - Israeli advanced-packaging company doing inspection during packaging: it checks bump alignment + interposers. This is a bottleneck for the next 2 years (CoWoS expansion). Interesting because they can probably sell worldwide (into China) without the US restrictions that apply to US companies.
  • ONTO - Onto Innovation does something similar to CAMT, but they are US-based and heavily exposed to the auto industry, which is in a soft period. Their stock has not run much.
  • TSEM - Tower Semi - specialty foundry doing analog chips, RF/wireless, sensors, and importantly silicon photonics. They use older process nodes (65nm, 45nm). Not cutting edge, but essential.
  • AEHR - burn-in system equipment for stressing packaged parts.
  • TER - final test systems. Also has a lot of exposure to robotics.
  • NVMI - metrology, measure film thickness, feature dimensions.

Tangential to semicap

  • Amkor - OSAT provider; they have been squeezed out by TSMC on the high-end AI packaging. But maybe they won’t be in the future, because there is simply too much volume for TSMC to do it all. They are known to do packaging for Micron.
  • ASE Technology (ASX) - the largest OSAT provider in the world. The same overflow thesis applies.
  • SiTime - they make MEMS-based timing chips which is silicon that replaces quartz crystal timers.
  • Keysight (KEYS) - they make electronic test equipment: oscilloscopes, signal analyzers, network analyzers. It’s more R&D equipment, but has a tailwind from all silicon investment.

Things to watch on photonics

  • LITE (Lumentum) - they are known for optical switching. So they make lasers + photonics components. They also make 3D sensing lasers (used by FaceID in iPhones).
  • CIEN (Ciena) - optical networking equipment. They make complete optical networking systems. I.e. they buy Lumentum components and package them into final pieces of equipment. A bit more exposed to telecom carriers. Telco spending has really slowed (it’s already built!).
  • BE - Bloom Energy makes methane fuel cells. They are becoming very useful for powering non-grid-connected datacenters. They are attractive because they sidestep most emissions permitting: they emit only water + CO2, with no nitrogen pollutants.

Major players

  • INTC (Intel) - America’s only cutting-edge fab company, and one that has been struggling for a decade. After many missteps, maybe Intel is making a comeback. Or maybe it’s just that the government will always step in to help, because if Intel goes bust it’s a major national security issue. Whatever you believe, a lot of people are betting on a comeback.

My current book (semis)

  • MU - HBM memory shortage
  • SNDK - NAND memory shortage
  • SK Hynix - HBM memory shortage
  • TER - mentioned above
  • ONTO - mentioned above
  • AEHR - mentioned above
  • AVGO - Broadcom - they co-design and supply the TPUs for Google and will make other custom ASICs (which compete with NVIDIA GPUs).
  • AMD - GPU maker & OpenAI proxy
  • NVDA - GPU maker
  • NVTS - they make power semiconductors using GaN and SiC. They compete with ON Semi and Texas Instruments.
  • ASX - ASE Technology does OSAT

My current book (non-semis)

  • GOOGL - obvious
  • MSFT - also obviously an OpenAI proxy
  • AMZN - it’s simply the Dutch East India Company of the current era. They also make their own AI chips (Trainium), btw.
  • RKLB - datacenters in space, need I say more?
  • META - they will survive somehow
  • CRWV, NBIS, CORZ - I think people are too bearish on neoclouds
  • TSLA - Tesla is no longer a car company. 20%+ of their revenue last quarter came from power systems; this will increase a lot. They also have AI self-driving taxis now.

Things I plan to add

  • CAMT
  • TSEM
  • ASX
  • BE (?)
  • Anthropic whenever it IPOs

Other background reading you should do

  1. AI 2027
  2. Situational Awareness
  3. Ray Kurzweil, The Singularity is Nearer
  4. Machines of Loving Grace by Dario Amodei (CEO of Anthropic)

Key risks

  • Power / interconnect bottlenecks last longer than expected (grid queues, transformers, permitting).
  • AI spend slows (macro downturn, CFO caution, or model ROI disappoints in enterprise).
  • Supply catches up faster than expected (HBM, packaging, networking), compressing margins across the stack.
  • Policy shocks (export controls, national security, antitrust).
  • Software-side commoditization (open-source models + falling inference costs) shifts value capture away from some “picks and shovels”.

Conclusion

I know this was long but I hope it was helpful. If anything, it’s a snapshot of where we are in January 2026 and what themes have been popular lately in AI.

I will say there has been a lot of back-and-forth this year about timelines. After GPT-5 was released it felt like people were very disappointed for a while. But now that we have Gemini 3, GPT 5.2 and Opus 4.5, it feels like we ended 2025 on a high note.

The coding agent capabilities that we got by the end of 2025 are nothing short of mind-blowing. If we even get half the rate of progress in 2026 that we got in 2025, we will have a very different world when it comes to knowledge work.

Now, when it comes to some of the other stuff like robotics, materials science, and drug discovery, it’s still very early days. The main takeaway on these fronts is that progress is being made, but it could be 2030 before you see robotaxis and drones everywhere and humanoid robots in frequent use.