
2 posts tagged with "Design"


Carl Liu · Kazem Jahanbakhsh · 8 min read

Goal

Here we share some anecdotal observations about what has been happening in the internet market post-ChatGPT, discuss the market disruption we have observed, and offer predictions for the future.

Long tail search market disruption

Many companies, such as Course Hero, Stack Overflow, Reddit, Wikipedia, Quizlet, and Chegg, used to receive lots of organic traffic from Google. However, their traffic has declined since the launch of ChatGPT: see this article on Stack Overflow reporting a 14% traffic loss in 2023, and this article on Chegg's stock. Our hypothesis is that many younger users who have adopted ChatGPT and other AI assistants now go to those tools for direct answers rather than to Google for links.

We argue that Google traffic has declined for these niche, long-tail searches. One piece of anecdotal evidence is this article describing the "Code Red" situation at Google. Google also released AI Overviews for these niche searches in May. Note that Google makes little ad revenue from this segment.

Some Predictions

  • LLMs are getting better, and they will absorb more queries and questions over time. That means apps leveraging AI will keep growing while Google's share of this market keeps shrinking. The innovator's dilemma!
  • Many internet companies need to think hard about what the internet will look like in the future; otherwise they will disappear in 3 to 5 years.
  • Companies like OpenAI and Perplexity need to figure out how to turn this into a scalable business model. They might fail, but regardless, the LLM, which is essentially a compressed internet, is not going anywhere. A new company outside the existing AI players might be the one to figure out the business model.
  • Post-ChatGPT, the distribution channels are going to be different!

Scaling law and future of LLMs over the next year

A month ago I heard that Elon Musk had built a cluster of 100k GPUs in Memphis, the first of its size. It seems no one thought this was possible, including Microsoft. He is apparently now working on a cluster of 1M GPUs. Some VCs argue that with clusters of this size, x.ai might get ahead of OpenAI and beat GPT-4 over the next few months with Grok 3. There are obviously many other variables, including whether the scaling law really holds and whether his team can pull this off. Interestingly, x.ai's job postings say they use JAX and PyTorch.

Overhype vs. underhype in the LLM market: I think it is useful to track what is overhyped and what is underhyped in the market.

OpenAI's improvement and its impact on LLM startups: Sutton's "bitter lesson" about AI progress is a good historical lens: any technique that gets better with more data and compute is the way to go. So one test is to ask whether a startup's product benefits from that trend. I have seen companies say, "We are not competing with OpenAI; as their model gets better, our system gets better too." One issue with this reasoning is that if OpenAI's model improvements improve your bottom line (e.g. a coding assistant), what will stop OpenAI from taking your market? We have already seen this with ChatGPT taking a big portion of the writing market.

LLM startup overhype: We see a lot of investment in LLM startups, but we should define which kinds of startups can win and which have less of a chance. Considering the bitter lesson, will startups working on prompt optimization/monitoring or hallucination detection win? Prompt optimization has already become less relevant as models improve: a prompt optimization that helped GPT-3 mattered little for GPT-4. The same goes for startups working on agentic solutions, which get less relevant as models become more reliable. Historically, models have kept getting better since BERT, so I would not bet against model improvement.

How AIGC Revolutionized Customer Acquisition in the Digital Advertising Space

When AI-generated content (AIGC) first started gaining traction, the technology seemed like a playground for creatives and technologists. But savvy marketers quickly saw its massive potential in one of the most competitive industries: customer acquisition.

What started as experimental AI-generated images and videos quickly evolved into highly realistic content. This content was indistinguishable from human-made creations, and people couldn’t tell whether they were interacting with a real person’s account or a purely AI-powered entity. Enter Digital Advertising Affiliates—a new, wildly effective approach to acquiring customers at a fraction of traditional costs.


The Rise of AIGC-Powered Affiliate Networks

In practical terms, here's how it worked:

  1. AIGC Meets Social Media
    Affiliate marketers and MCN (Multi-Channel Network) companies began mass-producing social media accounts across platforms like TikTok, Twitter, and Instagram. These accounts leveraged AI-generated videos and images—polished, hyper-realistic, and tailored to niche audiences—to amass followers. Think “hot girl” Instagram accounts or ultra-luxurious TikTok creators, but all powered by AIGC.

  2. Cost-Effective Customer Acquisition
    These AIGC-powered accounts didn't just gain followers—they became prime real estate for startups. Startups signed deals with these affiliate networks to drive traffic to their apps and products. Unlike running traditional paid campaigns on Google or Facebook, where every impression costs money, this model replaced expensive SEO or Media Buys with self-owned audiences.

    The result? 20x cheaper customer acquisition costs compared to traditional platforms like Google Ads.

    Let’s break this down with a real-world example. Apps like Rizz (a GPT-powered AI chat assistant) were spending $10 on Google or Facebook to acquire a single customer. With AIGC affiliates, that cost plummeted to just $0.50 per customer—an insane margin boost.


Why AIGC Is a Game-Changer for Startups

For startups like Rizz, the economics of this strategy are transformative. Here's why:

1. Lower Cost Per Install (CPI)

Instead of spending $2 or more to acquire a user, this AIGC-driven affiliate strategy brought CPI down to $0.10 per install. That’s a 20x reduction in cost. The equation changes drastically when you’re not at the mercy of Google or Facebook’s ad pricing.

2. Reduced Risk in Monetization

With costs so low, profitability becomes far easier to achieve. For example:

  • At $2 per user, an app would need at least 20% of users to convert to subscriptions to break even.
  • At $0.10 per user, even a 1% conversion rate makes the app wildly profitable.

To put this into perspective: the average app subscription rate in the U.S. is 5%. As long as your app isn’t crashing on the welcome screen, you’re almost guaranteed to win.
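The break-even arithmetic above can be sketched in a few lines. The $10 of revenue per subscriber is an assumption inferred from the "at $2 per user you need 20% conversion" figure; the article never states it outright.

```python
# Break-even math for the CAC comparison above. The $10 revenue per
# subscriber is an inferred assumption, not a number stated in the text.
def breakeven_conversion(cac: float, revenue_per_subscriber: float = 10.0) -> float:
    """Fraction of acquired users that must subscribe to cover acquisition cost."""
    return cac / revenue_per_subscriber

# Traditional ads: $2 per user -> 20% of users must subscribe.
print(f"{breakeven_conversion(2.00):.0%}")  # 20%
# AIGC affiliates: $0.10 per user -> 1% must subscribe.
print(f"{breakeven_conversion(0.10):.0%}")  # 1%
```

Under these assumptions, a 5% subscription rate against a $0.10 CAC leaves a 5x margin over break-even, which is the "almost guaranteed to win" claim in numbers.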

3. Eliminating Dependence on Big Ad Platforms

AIGC doesn’t just lower costs—it frees startups from the monopolistic grip of platforms like Google and Facebook. Instead of paying for every impression, the costs shift to the creation and maintenance of these AIGC-powered affiliate accounts, which have a compounding return over time as they gain followers and engagement.


Why This Strategy Works

AIGC content has two distinct advantages that make it perfect for this affiliate-driven model:

  1. Scalability
    AI-generated content can be produced in bulk at a fraction of the cost of traditional content creation. Whether it’s an AI-generated TikTok influencer or a sleek Instagram reel, AIGC allows marketers to scale campaigns effortlessly.

  2. Authenticity (or the Illusion of It)
    People on social media engage with accounts that feel authentic or aspirational. AIGC can replicate the aesthetics and personalities that resonate with audiences, making these accounts highly effective without requiring a human behind them.


How Startups Can Leverage This

If you’re a startup founder or marketer, this strategy is not just a hack—it’s a playbook:

  1. Partner with AIGC-powered affiliate networks to drive users to your app or website.
  2. Focus on low-cost content production and account management instead of paying ad giants for impressions.
  3. Lower your CAC (customer acquisition cost) to the point where even a modest subscription or purchase rate makes your app profitable.

The beauty of this strategy lies in its simplicity. As long as your app has a decent retention rate, it’s nearly impossible to lose. By leveraging the unique scalability and cost advantages of AIGC, startups can rewrite the rules of customer acquisition, bypassing traditional advertising giants and building a direct, sustainable growth engine.

Carl Liu · 8 min read

Definition of Liberal Arts: Liberal Arts is intended to provide chiefly general knowledge and to develop general intellectual capacities (such as reason and judgment) as opposed to professional or vocational skills.

Growing up in two different countries, China and Canada, I encountered a common trend: faculties were often divided into the Faculty of Arts and the Faculty of Science. However, my experience working as an engineer at Presence, a pioneering AR tech startup, has taught me that the value of a liberal arts education is often underestimated in the tech industry.

In some extreme cases, engineers believe that hard skills like coding are the only skills that matter, while liberal arts education is dismissed as irrelevant or impractical. However, I argue that this is a flawed perspective. In fact, liberal arts education can be just as valuable as hard skills for engineers working in the tech industry.

Limitation of Engineering Education

Throughout my academic and professional experience in the technology industry, I have come to recognize three major issues that are rarely discussed.

1. Fixed Reward Mechanism

In academia, in technical interviews, and in industry, the standards for evaluating and rewarding engineers are often fixed. Engineers tend to obsess over code cleanliness, optimization of memory and compute usage, and test coverage. While these standards may contribute to the development of better engineers, they can result in less creative problem-solvers overall. In fact, some experts in the field, like Dan Abramov, have highlighted how an obsession with clean code can be problematic. Although there is value in these standards, they prioritize certain skills over others and consequently limit engineers' capacity to be well-rounded creators.

Examples of these reward mechanisms include getting an A in a course because your exam answers were elegant, or landing a job offer because you wrote a perfect algorithm that solved a HackerRank problem faster than anyone else. Yet building a better API product than Stripe does not necessarily mean that people will abandon Stripe for your product.

2. Narrow Field of View

Engineers often have limited interaction with customers. Typically, they focus on writing code, leaving product managers and designers to define the features that need to be implemented. This can result in a top-down working model in which engineers remain in the background. However, numerous thought leaders, like Paul Graham and Clayton Christensen, have emphasized the importance of understanding and interacting with customers. In his essay "Do Things That Don't Scale", Paul Graham remarks:

A lot of startup founders are trained as engineers, and customer service is not part of the training of engineers. You're supposed to build things that are robust and elegant, not be slavishly attentive to individual users like some kind of salesperson... They'd rather sit at home writing code than go out and talk to a bunch of strangers and probably be rejected by most of them. -- Paul Graham

3. Limitation of Experiments

Although science is rooted in experiments, many liberal arts subjects such as design, calligraphy, and music produce results that are not always unequivocal. In music, for example, different compositional genres (romanticism, atonality, jazz) will attract and repel different audiences. This subjectivity within certain fields means that experimentation produces more open-ended results.

Overall, recognizing these three limitations of engineering education can encourage engineers to develop broader skill sets so that they can become better communicators and creative thinkers.

Universal Understanding

The liberal arts were the continuation of Ancient Greek methods of enquiry that began with a "desire for a universal understanding." [1]

I am a strong believer in "Orthogonal Learning", an approach that gathers inputs from as diverse a set of sources as possible during learning. In Computer Science there is Problem Reduction: transforming one problem into another. There are many examples of ideas from other fields that have helped explain and solve computer science problems:

  • For people familiar with Unity or video game development, the quaternion is a common technique for object rotation, used for example in character movement. I was not aware of the terminology, but it is no stranger to anyone who has studied physics or aerospace engineering.
    [Figure: quaternion (physics and aerospace) vs. leg rotation (Unity)]
  • Donald Knuth used many real-life problems to describe algorithms in his book "The Art of Computer Programming"; his use of railway track design to explain the deque data structure is fascinating:
    [Figure: track design (civil engineering) vs. deque (computer data structure)]
  • And as a music minor, I learned that composing with chord progressions is simply building a finite-state automaton:
    [Figure: chord progression (music) vs. finite-state automaton (computer science)]
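To make the quaternion crossover concrete, here is a minimal pure-Python sketch of quaternion rotation, the same operation Unity's Quaternion type performs when rotating a character's limbs. The axis-angle constructor and Hamilton product below are the standard textbook definitions, not Unity's API.

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation of `angle` radians about `axis`."""
    x, y, z = axis
    n = math.sqrt(x * x + y * y + z * z)
    s = math.sin(angle / 2) / n
    return (math.cos(angle / 2), x * s, y * s, z * s)

def quat_mul(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def rotate(v, q):
    """Rotate 3D vector v by quaternion q via v' = q * v * conj(q)."""
    w, x, y, z = q
    conj = (w, -x, -y, -z)
    _, rx, ry, rz = quat_mul(quat_mul(q, (0.0, *v)), conj)
    return (rx, ry, rz)

# Rotate the x-axis 90 degrees about z: (1, 0, 0) maps to approximately (0, 1, 0).
q = quat_from_axis_angle((0, 0, 1), math.pi / 2)
print(rotate((1.0, 0.0, 0.0), q))
```

The same formulas drive aircraft attitude control and game character rigs alike, which is exactly the orthogonal-learning point.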
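Knuth's railway-siding picture maps directly onto the deque: cars (elements) can enter or leave the track from either end. A minimal sketch using Python's collections.deque:

```python
from collections import deque

# A deque ("double-ended queue") allows insertion and removal at both
# ends, like railway cars entering or leaving a siding from either side.
track = deque()
track.append("car-A")      # car enters from the right end
track.append("car-B")
track.appendleft("car-C")  # car enters from the left end
print(list(track))         # ['car-C', 'car-A', 'car-B']
print(track.pop())         # 'car-B' leaves from the right
print(track.popleft())     # 'car-C' leaves from the left
```

Restricting entry and exit to one end recovers a stack; restricting them to opposite ends recovers a queue, which is how Knuth relates the three structures.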
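The chord-progression idea can be sketched as a finite-state automaton whose states are chords. The transition table below is a simplified, illustrative fragment of common-practice harmony, not a complete grammar:

```python
# Each chord is a state; the sets are the chords allowed to follow it.
# This table is an illustrative simplification of common-practice harmony.
TRANSITIONS = {
    "I":  {"IV", "V", "vi"},
    "IV": {"V", "I"},
    "V":  {"I", "vi"},
    "vi": {"IV", "ii"},
    "ii": {"V"},
}

def is_valid_progression(chords):
    """Accept a progression iff every adjacent pair is a legal transition."""
    return all(b in TRANSITIONS.get(a, set())
               for a, b in zip(chords, chords[1:]))

print(is_valid_progression(["I", "V", "vi", "IV", "V", "I"]))  # True
print(is_valid_progression(["I", "ii"]))                       # False under this table
```

Checking a progression is just running the automaton over the chord sequence, the same membership test used for any regular language.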