When ‘Log Kya Kahenge’ becomes ‘AI Kya Kahega’

12 April, 2026 08:09 AM IST | Mumbai | Nishant Sahdev

Why your battery percentage and typing speed have replaced the nosy neighbour as the ultimate judge of your character

For decades, privacy in India was never really about being hidden - it was about being understood, or at least, being allowed to be misunderstood. A secret was rarely private; it was a shared fiction. Families knew parts of it, neighbours guessed the rest, and society agreed to look away as long as you played your role well enough. The real anxiety was judgment. Log kya kahenge? But even that judgment had a human loophole. It could be managed with some conversation.

That era has ended. As AI matures from a tool into an everyday reality, we have traded the nosy neighbour for a clinical, invisible, and far less forgiving observer. We haven't just moved our lives online; we have surrendered the very possibility of being misunderstood. We are entering the age of Predictive Omniscience, where the machine doesn't need to hear your secret - it simply calculates the shadow your existence leaves behind.

The metadata autopsy: Character as a calculation

The death of the secret began with "Inference" - the machine's ability to derive unknown truths from known behaviours. A decade ago, the "Target incident" served as an initial warning: a retailer's algorithm identified a teenager's pregnancy before her family did, simply by tracking a change in her preference for unscented products.

In the rapidly digitizing economy of India, your character is being dismantled and reassembled via metadata. When you download a lending application, the system bypasses your bank balance to audit your Entropy. It notes that your battery frequently hits 1%, flagging you as "unstable". It measures the micro-latency in your typing, correlating rapid "delete" key usage with financial impulsivity. It scans your contact list - not for names, but for order. A phone book filled with nicknames and emojis is now a statistical proxy for low social capital.

This is Metadata Morality. The machine has judged your reliability before you have even submitted the application. It doesn't need your facts; it only needs your habits. In this cold, algorithmic reality, there is no room for the "human excuse". The algorithm doesn't care that your phone died because you were working two jobs; it only sees the 1% battery as a mathematical red flag for insolvency.

The digital caste system: Predestination

We are witnessing the emergence of a Digital Caste System - a new form of inherited social stratification. In the traditional caste system, birth dictated access to resources and social mobility. The Digital Caste System performs the same function with greater efficiency. Unlike a credit score, which can be improved through discipline, your digital status is becoming irreversible. You cannot "un-type" the metadata of your past or "un-live" the statistical patterns the machine has indexed.

Furthermore, this status is becoming inherited. Under the new Account Aggregator frameworks, your data is no longer an isolated island. Your financial hygiene is linked to your household; your health risks are linked to your parents' data; your "trustworthiness" is a cluster score shared by your social graph. If you are born into a "Low Trust" digital cluster, no amount of individual effort can override a machine's 98% certainty that you are a high-risk liability. We are building a world where children inherit the digital "sins" of their parents, locked into a stratum of society where the gates to credit, insurance, and elite employment are closed by an algorithm that never forgets.

The bio-digital kundali: The end of choice

In India, we have long looked to the Kundali - the horoscope - to predict fate. AI is the Bio-Digital Kundali, and it is becoming lethal in its specificity.

Consider the coming shift in matrimonial vetting. Rejection will not be based on a mismatch in salary or caste, but on a "Conflict Probability" score that predicts a 60% chance of divorce within five years. Algorithms already analyse "Emotional Entropy" during video calls - reading the micro-expressions and linguistic patterns that reveal incompatibility long before the first argument occurs.

Families will soon demand a Digital Compatibility Index. A groom may be rejected because his "Genetic Debt" - calculated from leaked family health archives - suggests a high probability of chronic illness. The secret of a family health history or a volatile temperament is no longer something that can be hidden; it is a data point on a dashboard, a bio-digital barrier that no social performance can overcome.

The theft of futures

The fundamental danger lies in the shift from mere historical analysis to predictive certainty. Using models such as "life2vec," AI can now process the life trajectories of millions - incorporating job titles, salary fluctuations, and medical records - to "autofill" the conclusion of a person's life with a staggering 90% accuracy.

This is the ultimate theft. We are protected from the theft of our facts, but we are completely exposed to the theft of our futures. If a machine can predict professional burnout or a health crisis years before the first symptom, you are no longer a participant in your own life. You are a passenger in a pre-calculated trajectory.

This leads directly to Pre-emptive Denial. You are not rejected from a job because of a bad interview; you are rejected because the hiring AI has calculated a 70% probability that you will quit within eighteen months. You are denied insurance because your grocery data suggests a predisposition to a disease you haven't developed yet. In the old world, you were punished for what you did. In the new world, you are penalized for what the math says you will do.

The new social contract

The transition from human-centric "Social Judgment" - imperfect and prejudiced yet inherently human - to the platform's "Algorithmic Calculation" is now complete. While a prying neighbor might be swayed by a proof of integrity or a shared past, such human performances fail before an algorithm. It remains indifferent to your "vibe," unswervingly committed to the cold, clinical computations of your personal metadata.

Society is transitioning from Log Kya Kahenge to Statistical Predestination. Our pursuit of efficiency has created a digital Panopticon where secrets can only be kept by ceasing to exist.

The machine has already decided. It has indexed your risks, calculated your lifespan, and assigned you a permanent place in the new social hierarchy. Your "second chance" was never in the code. In the drive for optimization, we haven't just lost our secrets; we have lost our right to be more than the sum of our data. The secret is out, and the verdict is final: your future is no longer yours to write. It has already been calculated.

Nishant Sahdev is a theoretical physicist at the University of North Carolina at Chapel Hill, US, an AI advisor, and the author of the forthcoming book The Last Equation Before Silence.
