Tokens, AI Hype, OpenAI to Challenge Google, AI on Phone
Hype vs Reality, AI Engineers burnout, Token VS Parameters, How to win an Argument, 10 Ways to use AI on your Phone
Hi You!
Saturdays are for AI News, Learning, Career, Mental Toughness, Productivity, Community, and Resources.
Just trying to add more value to your life!
Happy AI.
Today’s Content →
AI Leadership👑 → AI hype of half-working solutions VS Reality
News Hits🗞️ → AI Engineers Burnout, OpenAI to Challenge Google
Concept 🧑💻 → Tokens VS Parameters
AI Career & Job 🚀 → AI Engineer and Researcher
Mental Toughness🦾 → How to win an Argument
Productivity♻️ → AI on your Phone
Community⭐ → Like-Minded Humans
AI Leadership 👑
AI hype of half-working solutions VS Reality📌
You must have been overwhelmed by the AI hype lately.
It’s hard to separate the actual advancements from the marketing overreach.
Hype happens when companies talk up their technology to appear more advanced than it really is.
When businesses do this, it creates unrealistic expectations.
2024 is a war for attention, and whoever captures it will make a fortune.
This might help them attract investors, partners, and customers in the short term.
But in the long term, customers will be disappointed.
News Hits 🗞️
OpenAI to Challenge Google with Its Own Search Engine in May Read
AI Engineers report burnout and rushed rollouts Read
Concept 🧑💻
(Image: Tokenizer by OpenAI)
Tokens are often assumed to be individual words or chunks of 3 to 4 characters, but that’s not accurate.
Tokens can be whole words or parts of words, as seen in the image above (and in the sketch after the list below).
Large Language Models use tokens to measure 3 things →
the size of the data they trained on
the input they can take
the output they can produce
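To see this for yourself, here is a minimal sketch using OpenAI’s open-source tiktoken library (pip install tiktoken); the sample text is only an illustration:

```python
# Split a piece of text into tokens with OpenAI's tiktoken library.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-4 / GPT-3.5 models

text = "Tokenization is fascinating!"
token_ids = enc.encode(text)
print(token_ids)  # a list of integer token IDs

for tid in token_ids:
    # Decode each ID back to its text piece; some pieces are only parts of words.
    print(tid, repr(enc.decode([tid])))
```

Run it and you’ll see that short, common words usually map to a single token, while longer or rarer words get split into several pieces.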
The tokens are then converted into numeric embeddings, since models of every kind process only numbers.
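To make that concrete, here is a minimal sketch assuming PyTorch; the vocabulary size, embedding dimension, and token IDs below are made up for illustration:

```python
# Map token IDs to numeric embedding vectors (illustrative sizes only).
import torch
import torch.nn as nn

vocab_size, embed_dim = 100_000, 8            # assumed, not GPT's real sizes
embedding = nn.Embedding(vocab_size, embed_dim)

token_ids = torch.tensor([9906, 1917])        # hypothetical token IDs
vectors = embedding(token_ids)                # one 8-number vector per token

print(vectors.shape)                          # torch.Size([2, 8])
```

Each token ID becomes a row of numbers, and those numbers are what the model actually computes with.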
GPT-3 was trained on more than 500 billion tokens.
GPT-3 has 175 billion parameters.
Both statements are true.
Parameters are the model’s memory: the weights the model learns from its training data.
GPT-3 was trained on that data and, in the process, built a huge, complex n-dimensional matrix of numbers that we call parameters.
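As a rough illustration (a tiny toy model, nowhere near GPT), here is how you could count the parameters of a PyTorch model; every learned weight and bias counts:

```python
# Count the parameters (learned weights and biases) of a tiny toy model.
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(8, 16),   # 8*16 weights + 16 biases
    nn.ReLU(),
    nn.Linear(16, 4),   # 16*4 weights + 4 biases
)

num_params = sum(p.numel() for p in model.parameters())
print(num_params)        # 212 parameters; GPT-3 has 175 billion
```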
Analogy →
When we humans learn something, we gather all the information (data) we can and break it down into pieces (tokens); then we build our own understanding and remember only the important parts of it (parameters).
Mental Toughness🦾
How to win an argument without having solid reasoning and proof
A thread that should be illegal:
— E-go (@EgoDriv)
11:09 AM • Sep 3, 2022
Productivity♻️
I'm surprised how many people still don't use AI on their phones.
It can make your life easier and increase overall productivity.
Here are 10 ways to use AI on your phone (all free):
— Tulsi Soni (@shedntcare_)
5:00 AM • Apr 28, 2024