Here and now: on population and the human condition

In 1921, Raymond Pearl, Professor of Biometry and Vital Statistics at Johns Hopkins, made a startling prediction: the upper limit of the population that the United States could sustain would be reached in 2060, at around 200 million souls. Dr. Pearl employed logically sound analyses, extrapolating from the Malthusian doctrine prevalent at the time as well as state-of-the-art calculations of growth curves for other nations and for animal populations such as fruit flies. Of course, his calculations were wildly off: by 2011, the United States had already reached a population of over 310 million.

In October of the same year, according to some analysts, the world’s human population surpassed a statistical 7 billion. Of course, no one really knows when that demographic milestone was reached, and the selection of the 7 billionth infant in 2011 was largely a symbolic act. Regardless, the world’s population is in the midst of an explosion with no immediate signs of stabilizing.

Although there is no way of knowing for sure, it is estimated that the population of the world surpassed one billion in the early 1800s. It took over 100 years to double to two billion, but only thirty or forty more to reach three billion. By most official estimates, the global population exceeded four billion the year I was born, five billion when I was in the sixth grade, and six billion by the time I got my master’s degree. Given these trends, I do not think I will need to live very long to see it double in my own lifetime.

Since the mid-1970s, Carl Haub of the Population Reference Bureau has attempted an audacious exercise: estimating the number of humans who have ever lived, starting from a statistical first couple in 50,000 B.C. In mid-2011, he arrived at approximately 108 billion humans (give or take a few billion). His analysis presumably does not account for the evolutionary bottleneck caused by a supervolcano that erupted around 70,000 B.C., which may have left only a few thousand of our ancestors alive, but it otherwise seems plausible. It is in the ballpark.

If that is indeed the case, 108 billion is the sum total of every man, woman, and child who has ever lived. This number includes every Buddha, Aristotle, Alexander, Genghis, Shakespeare, Kalidasa, Hitler, Da Vinci, Stalin, Lincoln, and Newton. It also includes the vast multitudes of humanity who, for better or worse, never reached their true potential in life and thus disappeared into the quicksand of obscurity.

But even the people we remember lived very recently. As Carl Sagan astutely noted, if you compress the cosmic timescale into a single calendar year, all of recorded human history would pass in the last ten seconds of December 31. Even if we narrow the frame to just the time humans have been on the planet, there is no surviving narrative for most of our existence. The oral, mythical traditions of our preliterate ancestors have mostly vanished (and the written word is a very recent construct, even for those fortunate enough to have access to it in modern times). For 99.9% of human existence, our ancestors were hunter-gatherers. How long did they live? What did they die from? We will never really know. Our ancestors are alien to us.

No wonder the challenge of explaining the remaining 0.1% puts the social sciences on unenviable ground. At best, history and sociology can provide fragmentary papyri on the few individuals, customs, and events that have influenced us most. At worst, as conflicting accounts of the tragedies at Nanking and Dhaka in recent times show, oppressor and oppressed will obfuscate any approach toward an absolute truth, leaving it unknowable even to the most detached of observers.

The physical and life sciences cannot deliver an absolute framework of knowledge either. Biologists, for example, cannot study (or possibly even identify) every organism that was or is alive on this planet. A few model organisms, such as E. coli, yeast, worms, and fruit flies, are studied in extensive detail to make associative inferences, and we will continue to learn more, but scientific knowledge will always be fragmentary too.

If scratching beneath the surface of most of human existence is impossible, and if absolute knowledge is a mirage, that does not diminish the value of knowing what is knowable. As Thomas Kuhn hypothesized, it was likely during the relatively short spurts of revolutions and paradigm shifts that human history was shaped.

I am inclined to take the democratic view that human culture is the sum of what is important to all humans. If that is so, then our modern times hold special significance. Over 6% of all the humans who have ever breathed on this earth are alive now. (On a sobering note, this also means that over 6% will die within a single lifetime: deaths on a scale never before seen.)

In absolute terms, this puts immense pressure on society. Languages, for example, are under extreme strain to adapt to the needs of an unprecedented number of users in extraordinary situations.

For most of human existence, except in the immediate aftermath of wars, pandemics, and famines, the present has been the most important time in history. And if today is the most important day ever for humanity, then it behooves us to do whatever is possible to improve the collective human condition.



I have a problem that needs fixing. I am a bibliophile with limited space on my shelves.

I have put off this problem for a long time, hoping that it would go away by itself, but now I urgently need to make space for the new pages that will inevitably encroach upon my small apartment.

One way to fix the situation would be to get new bookshelves with more space. Another would be to pile the books I have amassed in some sort of crystalline close-packed form in storage ottomans. To a certain extent, I have employed both tactics, but short of moving into a larger home (and I dread the thought of lugging around the books I have already accumulated), the fact remains that the dimensions of my physical space will not change.

I do read words in digital format on the multiplicity of devices I have accrued, and this has certainly made carrying around hundreds and even thousands of books convenient. However, many of the physical books I own have a history of their own, are out of print, or were bought second-hand for a pittance. In addition, none of the books I own in Bangla are available in digital format. Others, like my marked-up copy of Darwin’s On the Origin of Species, have sentimental value.

For the longest time, I thought of shipping some of the books I’d already read back to the home I grew up in. I’d love to revisit some of those books, but am also quite certain that I won’t be able to do so anytime soon. For as Richard Ford astutely noted, “rereading’s actually an expensive and baulky luxury, since our roads are already lined with all those books we haven’t even read the first time and that have a first claim on us if we could ever get to them.”

I was never overly concerned that it would cost me more to ship the books than it had cost to purchase them in the first place. The worth of a book has no direct relation to the price paid for it. In the end, however, I decided against having my bulky books shipped across the oceans to a house that I will visit, perhaps, for a couple of days every few years, a house that is no longer my home.

So the problem remained, and I decided that the only way to tackle it was to discipline myself and clean up my shelves, sorting out what I would give away to others. Well, that was much easier said than done. Giving up old magazines took little time, because I could convince myself that, with the steady flow of information in more current issues, I would never have the time to revisit the old ones. Books presented a more difficult predicament.

There were books I had bought but never read. There were books I had read and promised myself I would read again. There were books I wanted to nibble on from time to time. There were ones I wanted on my shelf for no particular logical reason. And there was my bound dissertation, more a reminder of an era of my life than a reference: it had a very short shelf life in the world of scientific advances. I had to have a conversation with myself when deciding the fate of each one.

For even though I prided myself on being a swift reader, I knew it would be impossible to pay attention to every book on the shelf. Just as a starving man wants to keep a pantry full of comestibles of every kind, ever since I have been employed I have created a smorgasbord for my omnivorous reading habits. I always dreaded not having a choice, and as a consequence I have hoarded.

The fear that I would have absolutely nothing interesting to read was not entirely unfounded. After school was out one summer, I had been so starved of reading material that I spent weeks reading random entries in the Encyclopaedia Britannica, an act which, in hindsight, is probably only a few notches above copying the text word for word (as performed by the fictitious Jabez Wilson in “The Red-Headed League”). At the time, I had clearly run out of books to read and of the means to restock my shelves.

Thinking of the Encyclopaedia Britannica does bring back memories! You of the Expedia, Wikipedia, and crowdsourcing generation have no idea of the aura of prestige that owning a set of Micropaedia, Macropaedia, and Propaedia conferred. Just as hosts show off their fine china or DVD collections these days, my father showed guests our bookshelves. It was not Collier’s, Funk and Wagnalls, or the World Book encyclopedia that we owned: we owned a set of Britannica. It was The Encyclopaedia, and it damn well took up an entire wall.

That is just how things were back then. We went to the Encyclopaedia when faced with intractable geeky problems, and we usually left unsatisfied.

In a perverse way, thinking of the Encyclopaedia Britannica gathering dust on a bookshelf in the home I grew up in, its pages now yellow and its information obsolete, made the task of sorting the books I needed to part with much easier.

The long continuum

A mountainous range stood before the sympathetic tedium
Revolver so sound of mind not free to remember
It operates with happy, dull abandon,
Figment of the imagination? Never… The hurting went on
A chain screams noisily, but no one ever listens…
A bread or a radiant dragon is the key
As Cleopatra’s heart melted at the sight of death.
Down by the babbling brook the cow dreams.
Whining out in frustration, the goose knifed violently.


A chain screams noisily, but no one ever listens…

What emotions ran through the writer who conjured these images? What pain took root in a sick, depressed mind?

Only, “emotions” is perhaps not the right word. You see, this poem was not written by any human hand; it was spat out by a computer algorithm according to very formal rules of grammar programmed into a system. I only hastened the process by feeding it a few words.

Read it again. Although one line can be loosely threaded to the next, there is really no continuity. Each line is a discrete string of words that the computer constructed independently of the others.

However, despite the awkward syntax running through this nine-line poem in free verse, and the incongruity of certain metaphors, it is contextually similar to other poems I have read. It could have been written by someone I know. I noticed the improbability of its emotional quotient only because I was privy to the secret of its creation.

How do you critically evaluate a work of art when you know it was not created by a human? Does it even qualify as art? Perhaps other questions need to be asked as well. During the process of assessment, should the evaluator be blinded to the creator’s identity, so that the focus is solely on the intrinsic merit of the work? How much do the algorithm, the syntax, and the context matter?

Who is to say that whatever is synthesized mechanically without emotional input is not art if it holds the power to elicit an emotional output?

In our minds, a machine is still only a machine. It cannot breathe life into the lifeless. We pick up the phone and want to speak to another human: we want to connect with another sentient being capable of empathy. We want to know that there is an element of spontaneity, an aura of unpredictability, and that the rules can be bent. We want to have a conversation, not trigger voice activation.

If you could program spontaneity, would it still remain spontaneity? No, of course not. But perhaps, this is a moot point, for even in a randomly chosen human, there are only a finite number of possible responses to any situation. Perhaps, our machines only need more processing power.

A poem is not a voice-activated roboparrot. The written word differs slightly from a conversation in real time. The writer has already spoken. The reader is in the process of digesting what has been said while simultaneously formulating a response.

Even so, our reaction is always calibrated to what we already know about the writer. We would not react the same way if we knew to expect a computer.

We feel cheated. It is as if, in crowded surroundings devoid of interpersonal transactions, we turned to look and found that the forbidden touch which elicited joy or disgust was not flesh pressing against flesh; it was plastic.

The brilliant mathematician Alan Turing devised an ingenious test: a human interrogator, reading typed responses from a human and a machine, tries to gauge which is which.

The philosophical implication intrigues me more than any practical consideration. Can machines be made intelligent? The machines that churn out digital poems will not create something with the finesse of a Shakespeare. But that misses the point: only one human poet was Shakespeare. The range of human responsiveness shows that the goal is highly subjective. A human could write like a computer. A human can easily flunk a reverse Turing test, much in the same way that Charlie Chaplin once lost a Charlie Chaplin lookalike contest. The blurry combinations of letters we need to type to prove we are human are getting increasingly complicated. We need to get better at recognizing them.

The race to create machines like humans rarely takes into account the fact that we are becoming increasingly machine-like. We are the ones being programmed psychologically, in a corollary to the Turing test. Perhaps one day a human will be indistinguishable from a machine at a flickering cursor on the long continuum.

In memory of Alan M. Turing (1912-1954).

115 words

Miles of sand melt into a featureless horizon. Is each grain of sand different? A vital piece of an architectural puzzle?

I am in a rush to go nowhere in particular.

We are a tiny fragment of an incomprehensibly large universe and a gigantic sum of imperceptibly small atoms. Both the massive and the minuscule are empty. From one star to the next is the vacuum; from the ground state to the excited is the void.

Life is short on a cosmic timescale and long compared to the half-lives of super-heavy atoms.

In a lifetime, a few flickering memories pierce the plasma of monotony, until they too collapse as dying stars into oblivion.