Category Archives: health

Sanofi launches diabetes sensor for iPhone

Believe it or not, there are still some people who think smartphones are only good for updating Facebook statuses and playing Angry Birds. But even the staunchest of hold-outs are likely to have their minds changed as some real health benefits of internet-enabled mobile devices make themselves clearer over the next few years.

To that end, French pharmaceutical giant Sanofi on Thursday is launching its StarSystem diabetes management platform, which connects to iBGStar, a glucose-monitoring app for the iPhone and iPod Touch. The StarSystem platform is a web-based resource that provides personalized education and health information in the form of online articles and videos, as well as over-the-phone coaching from experts.

The iBGStar is more interesting, though. It’s a $65 attachment that plugs into the bottom of an iPhone or iPod Touch and analyzes a diabetic’s blood samples for glucose levels. The app features a dashboard that tracks and displays those levels over time, and the user can also email the results to his or her doctor.
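To give a sense of what an app like this does under the hood, here’s a minimal sketch of a glucose log that tracks readings over time and flags out-of-range results for sharing with a doctor. The class name, methods and thresholds are my own illustration, not anything from Sanofi’s actual software:

```python
from datetime import datetime
from statistics import mean

class GlucoseLog:
    """Hypothetical sketch of a glucose-tracking log (readings in mg/dL)."""

    def __init__(self):
        self.readings = []  # list of (timestamp, mg_dl) tuples

    def record(self, mg_dl, when=None):
        # Store a new reading, timestamped now unless a time is given.
        self.readings.append((when or datetime.now(), mg_dl))

    def average(self):
        # Average level across all recorded readings.
        return mean(level for _, level in self.readings)

    def flagged(self, low=70, high=180):
        # Readings outside the target range — the kind of data a
        # dashboard would highlight or email to a doctor.
        return [(t, level) for t, level in self.readings
                if level < low or level > high]

log = GlucoseLog()
for level in (95, 110, 210, 88):
    log.record(level)
print(f"average: {log.average():.1f} mg/dL, "
      f"out of range: {len(log.flagged())}")
```

The real app would obviously store more context (meal times, insulin doses), but the core idea — a timestamped series of readings with simple summary views — is the same.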

Sanofi says the iBGStar is the first such mobile device for diabetes, although other health sensors - such as the iHealth blood pressure dock - do exist.

Taken together, such devices are the first steps in a fledgling health care revolution, one in which regular people will have much more personalized - and accurate - information about their bodies. With better data, we’ll be able to detect problems before they occur and treat them more accurately when they do happen.

It’s a fascinating revolution that’s well documented by Dr. Eric Topol in his book The Creative Destruction of Medicine, which I’m currently in the middle of reading. Topol argues that while the internet and digital technology have completely revolutionized almost every industry, medicine and health care are still stuck in the relative dark ages. Slowly but surely, the industry is being dragged into the modern light of day. When it finally gets there, we’re going to experience some profound benefits to our collective health.

Smartphone sensors such as the iBGStar are only the tip of the iceberg, however. Things are going to get really interesting once injectable nanosensors, which will monitor us around the clock without any of our own conscious effort, become more commonplace.


Posted by on March 29, 2012 in health, mobile


Population bomb theory is a myth in a vacuum

No sooner had I finished writing about how technology fears are stoked by supposedly learned people and the media than another example rears its ugly head. This time, with the world’s population exceeding seven billion people, it’s new worries of a population bomb.

For those unfamiliar with it, the concept is at least as old as Thomas Robert Malthus, an English reverend and scholar of the late 18th and early 19th century. Malthus believed that if the world’s population kept growing at its then-pace, humanity would run out of food and other resources and experience a catastrophe that would greatly thin out the herd to a more manageable and sustainable size.

Of course, it didn’t happen and it probably never will despite vocal kvetching by modern-day Malthusians, simply because population growth does not occur in a vacuum. Everything else - particularly technology and the economy - grows alongside it. So far, this has served us very well, despite the increasing population.

The reality is that technology, economy and population are interlinked. The more a country has of the first two, the less it has of the third. A quick glance at birth rates confirms it - the rich, technologically advanced countries in North America and Europe typically have the lowest while those in Africa have the highest. Going by those figures, it’s obvious that the more prosperous a country is, the fewer children its people have, for reasons that are equally clear.

Historically, people had many children so that there would be more hands to work the land, but in a non-agrarian society that doesn’t make much sense. Moreover, with both parents typically working, it’s not practical to have many kids, from both a time and expense perspective.

The good news - not that the media ever really reports on this - is the global economy is doing a fine job of alleviating poverty, despite what the lingering economic crisis and Occupy Wall Streeters would have everyone believe. Over the past five years, about half a billion people (most of them in China) were elevated out of abject poverty, something an op-ed in the Jakarta Globe recently called the “fastest period of poverty reduction the world has ever seen.” As the article put it, “advances in human progress on such a scale are unprecedented, yet they remain almost universally unacknowledged.”

Fortunately, some people are taking these developments into account. The demographers at the United Nations know this, which is why they’re projecting the world’s population to peak at about nine billion roughly 40 years from now, then decrease. Their reasoning is simple: as people become wealthier, they have fewer children. On that end of things - the input, if you will - population growth is slowly but surely sorting itself out naturally.

All of this growth - whether in population, the economy or technology - that we’ve experienced over the past few centuries is hardly a bad thing. People everywhere - in countries rich and poor - are living longer and considerably better than they did a century ago, largely thanks to technological improvements in food production and medicine. Those inputs will continue to improve, so the oft-cited target of increasing food production by 70% to accommodate an even larger population may not actually be all that hard to meet. People who worry that the world is running out of food and water are perhaps not taking this inevitable technological advancement into account, the same way Malthus didn’t consider the improvements brought about by the Industrial Revolution.

Sometimes when you live in the forest, it’s hard to see the trees. For practical purposes, it might be hard to visualize some of the future gains the world is going to realize from all the technological advances currently being made, but we can expect with a high degree of certainty that they will happen.

The worriers are also perhaps being too cynical about human nature. While some are right to point out that rich, advanced countries simply waste too many resources, we do have a certain pragmatism too, which explains all the effort being put into developing alternative energy sources and more sustainable food production. If a shortage problem really does happen, it’s reasonable to expect that people in rich nations will lend a helping hand, the same way they did for the African famine in the 1980s and every other disaster since.

Should we waste less stuff? Sure, but until there are real and proven wide-scale shortages of oil, food, water or any other resource, people know on a subconscious level that the Malthusian population bomb theory is just a myth no matter how much the media tries to scare us.


Posted by on November 1, 2011 in evolution, food, health


The long weekend: a chance to whine about prosperity

It’s Victoria Day here in Canada, which means it’s a long weekend. Don’t be jealous, Americans - Memorial Day is only one short week away.

There was some pretty big craziness and myth-spreading over the weekend, and I’m not talking about the ridiculousness that was the Rapture. I’m talking about an article in The Globe and Mail about how it’s time for society to make three-day weekends the norm. I have nothing against that idea, but there are significant problems with some of the arguments the writer made in favour of it.

Michael Posner writes:

In an age of high-tech efficiency and higher productivity, why isn’t the working world organized to provide us with more leisure time? The benefits – social, economic, ecological – would be legion. Certainly, we were promised it. For more than a century, a loud chorus of visionaries has lauded the fruits of science and technology, and the personal liberties they would confer.

It hasn’t worked out that way. Indeed, as they embark on their annual Victoria Day weekend – National Patriots Day in Quebec – Canadians (tethered to BlackBerries, laptops and iPads) are more likely to be struck by a grimmer calculus. Our so-called work-life balance has lost its equilibrium. Increasingly, we are logging longer hours. Increasingly, we have less time for recreational pursuits.

The article goes on to cite a number of studies showing that leisure time hasn’t really grown much and that productivity gains have been squandered. Many readers agreed with the story and commented on how it’s all a plot by corporations and the rich to keep the little guy down.

Yowza. How spoiled we are.

Common sense and a look at some real numbers from the 20th century handily disproves all this. According to the U.S. Bureau of Labor Statistics, children made up about 6% of the labour force in 1900. By 2000, of course, that number was zero. Per capita income at the turn of the 20th century was $4,200 in 1999 dollars, while at the end of the century it was $33,700. Benefits accounted for 1% of compensation in 1929, the first year measured, while by 1999 it was 27.5%.

Work hour improvements have proven tricky to measure given that proper statistics haven’t been kept by industries for long, but those that do exist show improvements. The average private sector work week in 1999, for example, was 34.5 hours - or 6.9 hours a day. A manufacturing job in 1900 - which would have been a very common occupation at the time - generally required 53 hours, or 10.6 a day. That’s a decrease of about 35% - not too shabby.
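The arithmetic behind those BLS figures is easy to verify. A quick sketch, assuming simple annual compounding for the income figures over the 99 years between 1900 and 1999:

```python
# Check the arithmetic behind the BLS figures quoted above.

# Per capita income, both in 1999 dollars.
income_1900 = 4200
income_1999 = 33700
# Compound annual growth rate over 99 years:
# (33700 / 4200) ** (1/99) - 1 ≈ 2.1% real growth per year.
growth = (income_1999 / income_1900) ** (1 / 99) - 1

# Typical work weeks: 1900 manufacturing vs. 1999 private sector.
hours_1900 = 53.0
hours_1999 = 34.5
drop = 1 - hours_1999 / hours_1900  # fractional decrease, ~0.35

print(f"real income growth: {growth:.1%} per year")
print(f"work-week decrease: {drop:.0%}")
```

About 2% real growth per year may not sound like much, but compounded over a century it multiplies incomes eightfold, which is exactly the gap between those two numbers.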

Furthermore, what the numbers don’t show is the nature of that work. In 1900 a typical job involved back-breaking labour. In 1999, while it can be argued that the typical job was just as dreary, sitting behind a desk was considerably more comfortable. Indeed, people were far more likely to be killed or injured on the job in 1900 than in 1999. As the Bureau says: “Whether accidents are fatal or not, statistics indicate that they are less common, and the workplace is a much safer place, for the worker at the end of the century than at the beginning.”

To summarize: working in the 21st century is more comfortable, lucrative and safer than it was a hundred years ago, with the added bonus that most families don’t require children to help put food on the table. That has contributed to more affluence, longer lives and better health overall. Indeed, this comfort is actually helping to make us fat and lazy, as the move toward less physical jobs is considered a big factor in rising obesity levels.

The Bureau says a number of factors have improved working conditions, but the main one - naturally - was technology. Cars, computers and cellphones are just a few items in a long list of innovations that contributed to all of the above.

So how exactly have things not improved? Despite what the Globe article might suggest, answering an email on the BlackBerry while at the cottage is not a bad thing. Indeed, the emancipation that technology promised has in actuality arrived. The first shift saw workers move from difficult and potentially life-threatening labour to much cushier jobs. The next shift - the one we’re in now - is allowing people to break the chains that have traditionally bound them to desks. That might mean that some of those work hours shift over into what have traditionally been considered leisure hours and vice versa. The overall blend, I would think, is a beneficial one.

My advice: sit back and enjoy the long weekend and let other people whine about how good we have it.


Posted by on May 23, 2011 in health


Vampire popularity: it’s technology’s fault

I’ve been developing a bit of a weekly ritual in the months since I struck out on my own, a tradition I’ve taken to calling Terrible Movie Tuesdays. As the name implies, it’s the day I go to the theatre and see terrible movies or, generally speaking, films my fiance has no intention of seeing. Me? I just like going. The lower admission charge on Tuesdays and the fact that theatres are usually deserted during the middle of the day add up to a bit of fun solo time, even if that time is usually spent having my intelligence insulted by bad cinema.

Yesterday I checked out Priest, a movie that was ultimately as bad as it looked (you know it’s going to suck when there aren’t any advance screenings for critics). The premise, such as it was: humans and vampires have been at war for time immemorial. But, with the help of ninja-like priests, the Church has beaten back the blood-sucking hordes and imprisoned them on reservations. That leaves the holy folks free to run the world. I know - talk about a horror film!

The less said about the movie the better, but it did get me thinking about vampires. I’ve come to bloody well hate them (pun intended) just because they’re so overdone and oversaturated. If it’s not bad vampire TV shows - ahem, True Blood - then it’s bad vampire movies like Priest and bad vampire novels, which in turn get turned into bad vampire movies (yup, the one and only Twilight series).

I used to love vampires - my favourite TV show of all time remains Buffy the Vampire Slayer while one of my fave books is the original Dracula - but that was when they appeared sparingly. The amount of vamp-lore out there today, however, is getting out of hand. Knowing that the entertainment world works in cycles - where a genre is hot one minute and gone the next - I’ve been waiting for the eventual wane of vampires. But it just doesn’t seem to be coming.

So I’m wondering: is there something more to this vampire thing, other than it’s the current craze of the entertainment business?

The answer, I think, may be yes and - as always - it may be because of technology.

Vampires come in many different shapes and sizes - the ones in Priest were similar to the aliens in Aliens, thereby indicating that the “script” (and I shudder to call it that) may have originally starred some other kinds of monsters, but was then rewritten to suit the current flavour. All the different vampires do, however, generally have a couple things in common, most notably the need to continually drink blood and the ability to live forever if they do so.

Here’s where technology comes in. Thanks to science improving food, medicine and general prosperity in the world, people are living longer and longer, particularly in the developed world. Over the past half century, life spans have increased about 10 years to around 80 in such countries. In the nineteenth century, life expectancy was under 40. Even allowing for the fact that high infant mortality dragged down those historical averages, a typical 21st-century human would probably seem nearly immortal to someone living two centuries ago.

Moreover, we’re on the cusp of a significant further expansion in life spans, a topic I’ll be getting into in book #2. Biotechnology, robotics, nanotechnology, computing and discoveries in neurology are all going to provide the keys to first slowing aging and then reversing it. It sounds like science fiction but some of it is already possible. Where big advancements need to happen is not so much in the science but rather in our attitudes toward it.

I wonder if this trend of longer life is something people are subconsciously aware of, hence the popularity of vampire fiction: we want insight into the effects of immortality - and as much of it as possible. Will it make us better people, like Angel; will it make us sexier, like those Twilight dudes; will it make us annoying, like most of the vampires on True Blood; or will it make us giant douches, like Anne Rice’s Lestat? (Tom Cruise evidently has a head start.) In a nutshell: perhaps we’re curious about what longer life means, so we want to see it dramatized in entertainment from as many different angles as possible.

Then there’s the fact that the whole mythos of the vampire is based on a Faustian trade-off: eternal life in exchange for eternal killing and drinking of blood. Does the popularity of vampires reflect some sort of angst over a similar real-world trade-off, that science’s continual life extension is coming at a cost? Is our headlong pursuit of technology somehow costing us our humanity? That theme has been explored in countless science-fiction stories, from Frankenstein onwards.

I’m not sure, but obviously Terrible Movie Tuesdays have some merit to them. Being bored by what’s happening on screen provides a lot of time to ponder such questions and I surely can’t explain the continual popularity of vampires any other way.

Anyone else have any ideas?


Posted by on May 18, 2011 in health, immortality, movies


2021: Genetically engineered cures, not band-aids

The biotech revolution got underway back in 1982, when San Francisco’s Genentech won FDA approval for Humulin, a bioengineered form of insulin. Thirty years later, the vast majority of pharmaceuticals and food in North America is the product of some sort of genetic engineering. Europe has been slower to get on board because of an outbreak of Mad Cow disease in the late 1980s. The epidemic actually had little to do with genetic engineering - it was caused by essentially feeding cows junk food that contained various chemicals and antibiotics - but it nevertheless cast a chill on biotech on the continent. Africa, dependent on selling its food into Europe, shunned genetically modified crops as well.

Things are changing. The furor has died down, genetically modified foods are slowly creeping into Europe and some prominent environmentalists who used to oppose them have changed their tune. In North America, we’re entering the next stage of GMOs: genetically modified animals, designed for more efficient human consumption or for better environmental use. The AquAdvantage salmon, a fish that has been engineered to grow faster, is close to receiving FDA approval. The EnviroPig, whose manure contains far less phosphorus, will be close behind.

Meanwhile, the first GMO plant designed with a humanitarian - and not profit-driven - purpose in mind is also close to being rolled out. Golden Rice will finally become available at some point in the next few years.

Scientists are starting to get good with this technology. Biotech is entering another generation, where organisms are not just being modified to have one new trait, such as secreting their own insecticides. They’re starting to stack multiple traits: Monsanto’s SmartStax corn, which creates its own insecticide and is resistant to herbicides, is a good example. The law of accelerating returns is starting to take effect with biotech, so we’re going to see some major advances over the next decade.

These developments are going to combine with some big leaps being made in health. I recently blogged about some discoveries made in treating flu bugs and AIDS - these and other breakthroughs are being helped by information compiled through the Human Genome Project, which is continuing to give scientists new insights on how humans function and how their deficiencies can be cured.

I know it sounds utopian, but by 2021 biotech and medicine will have gone a long way toward solving many food and health problems. In that post, I included a Chris Rock video in which he joked that there is no money in curing disease, only in treating it. That’s true right now, but 10 years from now we will not only have the biotech tools to eliminate some diseases, we will also be considerably more comfortable with the technology. That’s when serious conversations will begin about whether it should go beyond plants and animals and be applied to humans.

In other words, there will be a push to start dealing with root health causes with biotech rather than simply putting genetically engineered band-aids on them. So-called “designer babies” are already possible, but the idea horrifies many people. Ten years from now, that attitude will have shifted as people realize it is inefficient to spend money treating a disease or defect when it can simply be eliminated.


Posted by on March 3, 2011 in biotech, drugs, GMO, health

