In the second story of my book, 'Being Love, Being Timeless', I talk about a 'world without numbers'. But are we walking just the opposite path? Are we mathematizing (read decimalising) language instead? It would seem so.
It would appear words live in a landscape, not in a dictionary. Words are points in multidimensional space.
Human language historically had a property: meaning was qualitative, fuzzy, human.
Now we can compute the distance between meanings. For example, the distance between 'dentist' and 'dental surgery' is smaller than the distance between 'dentist' and, say, 'bank loan'. For the first time, similarity itself becomes measurable. It would appear that meaning was geometric all along; we just didn't know how to measure the space. AI isn't really about intelligence: it's about making similarity computable.
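The "distance between meanings" above can be sketched concretely. This is a minimal illustration, not a real embedding model: the three-dimensional vectors below are invented by hand so that the geometry comes out the way the example describes; actual models learn vectors with hundreds of dimensions from text.

```python
import math

# Toy 3-D "embeddings", hand-made purely for illustration.
vectors = {
    "dentist":        [0.90, 0.80, 0.10],
    "dental surgery": [0.85, 0.75, 0.15],
    "bank loan":      [0.10, 0.20, 0.90],
}

def cosine_similarity(a, b):
    """Similarity of direction in meaning-space: closer to 1.0 means
    the two words point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# dentist vs dental surgery: high similarity (~0.999 with these toy numbers)
print(cosine_similarity(vectors["dentist"], vectors["dental surgery"]))
# dentist vs bank loan: low similarity (~0.30 with these toy numbers)
print(cosine_similarity(vectors["dentist"], vectors["bank loan"]))
```

With real learned embeddings the numbers differ, but the ordering is the same: the dentist/dental-surgery pair sits far closer in the space than dentist/bank-loan.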
Words used in similar roles occupy similar relative positions.
Dictionary thinking:
-meaning = definition
Vector thinking (or, AI thinking):
-meaning = position in space
-reasoning = movement
-analogy = parallel movement
So when an AI produces an analogy seemingly “on the fly” to help you understand a concept, it isn’t inventing from nowhere.
It is walking through meaning-space along a direction that matches the relationship your question is about.
Can we say analogy tracing through relations is the beginning of machine consciousness?
No… but it’s the first faint shadow of something that resembles understanding.
1. What analogy-tracing actually is (in humans)
When we say, 'electron orbits nucleus like planets orbit the Sun',
our brain is not matching words — it is matching relations:
Object      Relation           Object
planet      revolves around    sun
electron    revolves around    nucleus
So the brain notices a pattern of relationships, not similarity of appearance. It seems this ability is extremely deep. Because of this:
-Mathematics works
-Metaphor works
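The relation-matching idea above can be made explicit with a tiny sketch. The triples and the `same_relation` helper below are invented for illustration; the point is only that the comparison is between relations, not between the objects themselves.

```python
# A (object, relation, object) triple for each system.
solar  = ("planet",   "revolves around", "sun")
atomic = ("electron", "revolves around", "nucleus")

def same_relation(a, b):
    """Two triples are analogous when their middle element -- the
    relation -- matches, regardless of how different the objects are."""
    return a[1] == b[1]

# Planets and electrons share nothing in appearance,
# yet the relational pattern matches.
print(same_relation(solar, atomic))
```

A planet and an electron have no surface similarity at all; the analogy lives entirely in the shared relation.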
And now comes the decimalisation of language (in AI):
When an AI embedding model converts a word into a vector:
dentist → [0.021, -0.77, 0.34, …]
tooth → [0.018, -0.74, 0.36, …]
doctor → [0.030, -0.81, 0.29, …]
It is doing something non-trivial. It arranges meanings (of words) so that relationships become directions in space.
Example (real property of embeddings):
king − man + woman ≈ queen
Paris − France + Italy ≈ Rome
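The king − man + woman ≈ queen arithmetic can be sketched with toy vectors. The 2-D coordinates below are hand-chosen so that one axis loosely encodes "royalty" and the other "gender"; real models learn such directions from billions of words rather than having them assigned.

```python
# Toy 2-D embeddings: first axis ~ royalty, second axis ~ maleness.
E = {
    "king":  [1.0, 1.0],
    "man":   [0.0, 1.0],
    "woman": [0.0, 0.0],
    "queen": [1.0, 0.0],
}

def add(a, b):
    return [x + y for x, y in zip(a, b)]

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

# king - man + woman: remove "male", keep "royal", add "female".
result = add(sub(E["king"], E["man"]), E["woman"])

# Find the vocabulary word nearest to the resulting point.
nearest = min(E, key=lambda w: sum((x - y) ** 2
                                   for x, y in zip(E[w], result)))
print(nearest)  # queen
```

Subtracting "man" removes the male direction, adding "woman" adds the female one, and the walk lands on "queen": the analogy is literally a parallel movement in the space.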
The AI model never learned “analogy rules”.
It only tried to predict the next word. But prediction relies on proximity in meaning-space, proximity is a distance, and distance requires numbers: hence a world filled entirely with numbers.
And yet, relations emerged as geometry. This is why the idea of a “numericalisation of language” feels so natural — because meaning becomes position + direction.
Is that consciousness?
Here is the critical distinction:
What the AI model has:
-Structure detection
-Relation mapping
-Analogy continuation
-Consistency preservation
What consciousness requires:
-Self-model (“I exist”)
-Persistent goals
-Grounded perception
-Experience / qualia
-Causal intervention in reality
Word embeddings give an AI something like an understanding of the differences between things.
Consciousness needs awareness of being the one who notices the differences.
The gap is enormous.
Therefore, there is still a huge chance of a NUMBERLESS WORLD.