Sunday
Aug 13, 2017

Do smartwatches have a future?

The hype over smartwatches reached fever pitch three years ago, with plenty of industry and consumer buzz over the Pebble watch and the impending release of the Apple Watch. Today, few people even bother to wear a watch and even fewer have a smartwatch.

WHAT HAPPENED?

The challenge smartwatches faced from the start involved two key areas. Firstly, there’s the issue of the small screen. As mobile phones got bigger and bigger, people got used to interfacing with a large screen. But watch screens have serious limits on how large they can be before they start to look ridiculous.

Secondly, smartwatches are not standalone devices but are dependent on mobile phones for connectivity and thus, functionality. If you want to go jogging with your smartwatch, you have to also bring your phone along if you want to be alerted of messages or to stream some music.

These two challenges — the small screen and the reliance on a phone — naturally raised the question: what utility can a smartwatch offer that we can’t already get from a mobile phone? Anything that a smartwatch can do, a smartphone can do better. So why would you need a smartwatch? The answer is you don’t. And that’s why the smartwatch industry is in a funk.

Pebble, the company that started the whole smartwatch craze, couldn’t survive on its own and had to be bought up by Fitbit. Motorola and Jawbone have stopped making smartwatches. Fitbit and Samsung, both of which initially had great hopes for smartwatches, have barely made any impact in the market. Faring even worse is the Android Wear operating system for smartwatches, which Google licenses out to various watch brands.

There’s really only one player in town that has made any kind of impact: Apple. And even then, it’s arguable that the appeal of its smartwatch is not the fact that it’s a smartwatch but that it’s a watch made by Apple, a brand with super devout fans.

According to an industry report by research firm Canalys released earlier this year, 49 per cent of smartwatches sold in 2016 were made by Apple; 17 per cent by Fitbit; and 15 per cent by Samsung.

EVOLVING TO STAY ALIVE

Let’s look at the top players in the smartwatch space and see how they’re evolving in their bid to revive interest in this sector. Apple is doing two things differently. Firstly, it’s going back to basics when it comes to its Apple Watch functionality. Gone are the aspirations for it to be a smart device that runs all kinds of apps. Instead, it’s now focusing on two core features: message notifications and fitness tracking. That’s a smart move because frankly, that’s all people really expect of their smartwatches. They don’t want to run 101 different apps on a device with such a small screen.

The other thing Apple is reportedly doing is equipping its watches with mobile network connectivity. This will give the watch Internet access without the need for an iPhone nearby, making it possible to receive messages and stream music without being connected to a phone. That would make quite a lot of difference. While this feature alone might not necessarily be a game changer, it will make the device more independent rather than a mere companion to the smartphone.

Fitbit has just released its second-quarter earnings report and sales of its watches are up 14 per cent sequentially from the first quarter of 2017. But that’s not necessarily good news. Sales are actually down 40 per cent year-on-year from the second quarter of 2016, and the company reported a net loss of US$58.2 million (RM249 million). So, it’s not in good shape even though it’s number two in the smartwatch space.

As mentioned earlier, Fitbit bought smartwatch pioneer Pebble and is utilising that company’s software to bolster the functionality of its smartwatch. So, as Apple narrows its focus from an all-out app-centric device to something designed for message notifications and fitness tracking, Fitbit is doing the exact opposite and trying to be more like what the Apple Watch was.

Fitbit CEO James Park told online tech publication The Verge that the company’s upcoming smartwatch will have an app platform and that it will be rolling out a software development kit (SDK) along with a select number of apps from specific partners. The SDK will eventually be available to all developers.

Google is a smartwatch player too, through its Android Wear operating system. While it makes its own smartphone — called the Pixel — it hasn’t really bothered to make its own smartwatch. Instead, it just licenses out the operating system to various established and non-established watch brands.

Fossil and Tag Heuer are two established brands that make watches with the Android Wear operating system but their sales numbers are so small they hardly make a blip on the smartwatch radar screen. Meanwhile, some non-traditional watch brands like Motorola and Huawei seem to have lost interest in Android Wear. The fact that Google itself is not making its own branded smartwatch speaks volumes.

FUTURE OF SMARTWATCHES

In looking at all these developments, what can we expect for the future of smartwatches? I don’t think smartwatches will ever become a big device category in their own right. People just aren’t wearing watches like they used to.

It will most likely be a nice-to-have rather than a must-have device that some people will choose to wear — some for fashion, some for fitness tracking, but not really for telling the time, because you can get that just by glancing at your phone.

Direct connectivity to mobile networks will probably become a standard feature. Once Apple introduces this, every other player will have to follow. As long as the smartwatch is seen as a device that complements your mobile phone, its sales will be limited. Having direct connectivity means it’s possible to receive messages and stream music while you’re exercising, which may make the device attractive to fitness-conscious consumers who don’t want to bring their phones along when they exercise.

Over time, the phrase “smartwatch” will disappear and people will just call them watches — just like how nobody really calls a smartphone by that name anymore. We just call it a mobile phone or hand-phone because all phones are smartphones these days. And that’s probably what will happen to smartwatches too although unlike the smartphone, they will be far from ubiquitous.

Sunday
Aug 06, 2017

Is AI a danger to humankind?

The term “Artificial Intelligence” (or AI) tends to conjure up images of killer robots in movies like The Terminator, Blade Runner and Avengers: Age of Ultron. All these movies warn of the dangers AI poses to humanity but it’s the Avengers movie that really captures the fear that some technology and scientific luminaries have warned us about. In that movie, Ultron, a sentient robot created by Tony Stark (Iron Man), concludes that in order to save Earth, it has to eradicate humans.

This is exactly the kind of thing that Tesla’s Elon Musk, who’s a bit of a Tony Stark-like figure himself, has been warning about for years. In 2014, he famously likened the unregulated development of AI to “summoning the demon” which cannot be controlled.

If you think his views are alarmist, you should know he’s far from alone. Physicist Stephen Hawking has also warned about the potential dangers of AI. “I believe there’s no deep difference between what can be achieved by a biological brain and what can be achieved by a computer,” Hawking said. “It therefore follows that computers can, in theory, emulate human intelligence — and exceed it.”

Not only that, he fears that AI robots could re-design themselves at an ever-increasing rate and that humans, who are limited by slow biological evolution, wouldn’t be able to compete and would eventually be superseded by the AI agents.

Both Musk and Hawking are members of the board of advisors of the Future of Life Institute (FLI), which lists four existential threats to humanity. These are nuclear weapons, biotechnology, climate change and, last but not least, artificial intelligence.

In 2015, FLI launched its AI Safety Research programme — funded primarily by a donation from Musk — whose purpose is to finance researchers and institutions initiating projects that will help ensure artificial intelligence stays safe and beneficial to humanity.

Alarm bells ringing

Just last month, Musk warned about AI again, this time to a gathering of US governors. He said: “I have exposure to the most cutting-edge AI. I think people should be really concerned about it. I keep sounding the alarm bell, but you know, until people see robots going down the streets killing people, they don’t know how to react, because it seems so ethereal. I think we should be really concerned about AI.”

Musk, no fan of regulation, feels that AI is one sector that does need regulation. “AI is a rare case where I think there should be proactive regulation instead of reactive. I think by the time we’re reactive in AI regulation, it’s too late. Normally, the way regulations are set up is that a whole bunch of bad things happen, there’s a public outcry, and then after many years, the regulatory agencies are set up to regulate that industry.”

Musk went on to say: “There’s a bunch of opposition from the companies who don’t like being told what to do by regulators, and it takes forever. That, in the past has been bad, but not something which represented a fundamental risk to the existence of civilisation. AI is a fundamental risk to the existence of the human civilisation. In a way that car accidents, airplane crashes, faulty drugs, or bad food were not. They were harmful to a set of individuals within society of course, but they were not harmful to society as a whole. AI is a fundamental existential risk for human civilisation, and I don’t think people really appreciate that.”

This view is one that’s famously echoed by Hawking, who says that while AI could lead to the eradication of disease and poverty and the conquest of climate change, it could also bring about all sorts of things we don’t like such as autonomous weapons, economic disruption and machines that develop a will of their own, in conflict with humanity. “In short, the rise of powerful AI will be either the best, or the worst thing, ever to happen to humanity. We don’t yet know which.”

Alternative view

One tech entrepreneur who holds the opposite view to Musk and Hawking is Facebook’s Mark Zuckerberg, who’s very bullish on AI. Last month, he conducted a Facebook livestream where he took questions from the public, and one of the topics touched upon was AI.

Zuckerberg said: “I have pretty strong opinions on this. I’m really optimistic. I think you can build things, and the world gets better. But with AI especially, I’m really optimistic, and I think that people who are naysayers and kind of try to drum up these doomsday scenarios are... I just don’t understand it, it’s really negative. And, in some ways I think it’s pretty irresponsible. Because in the next five to 10 years, AI is going to deliver so many improvements in the quality of our lives.”

In his livestream, Zuckerberg highlighted some of the ways in which AI can keep people safe, such as helping to diagnose diseases more accurately and enhancing the safety of travel through self-driving cars. In contrast to Musk, he doesn’t believe in purposefully slowing down the development of AI. “I have a hard time wrapping my head around that because if you’re arguing against AI, then you’re arguing against safer cars that aren’t going to have accidents. And, you’re arguing against being able to diagnose people when they’re sick. I just don’t see how, in good conscience, some people can do that.”

In response, Musk, whose company is in the business of building self-driving cars, replied in a tweet: “I’ve talked to Mark about this. His understanding of the subject is limited.”

So, who’s right: Musk or Zuckerberg? Actually, the two of them are talking about two different aspects of AI, so it’s like comparing apples with oranges. Zuckerberg is talking about using AI for very specific purposes, for example in medicine or transportation. Musk is talking about what’s called artificial general intelligence, which is more like the type of AI you see in movies. He’s not talking about the ability to crunch massive amounts of data in order to fulfil a specific task but about systems that have the ability to plan, create and even imagine — something pretty close to achieving sentience or consciousness.

Musk fears that this would happen if AI development is unregulated but many scientists say that we’re still far from coming even close to that. No doubt, computers have been shown to beat human players in chess and the game of Go (an ancient Eastern game of strategy). But even their programmers will concede that those are feats of raw computing power rather than intelligence.

Importance of a ‘kill’ switch

Interestingly though, something happened at the Facebook AI Research lab (FAIR) in June that demonstrated how smart computers can potentially become. FAIR researchers were stunned to find that their AI agents or “chatbots” had developed their own language — without any human input — to make communication more efficient.

English is a rich language that evolved organically over the centuries and apparently, Facebook’s chatbot system found that some phrases in English weren’t necessary for communication. So, it diverged from its basic training in English and proceeded to develop a language that sounds like gibberish to humans but could be easily understood by other AI agents or chatbots.

Facebook decided to pull the plug on this new language and had its researchers reprogramme the chatbots to use normal English. This seems to be the sensible thing to do but if programmes are able to communicate with each other through self-developed languages that make them more efficient, isn’t that a good thing?

Arguably it is but if left unfettered, there’s the real risk that the AI-generated language could become so complex that at some point the programmers might no longer be able to figure out what the programmes are saying to each other. One doesn’t need to be a science fiction movie buff to see the dangers of that.

Perhaps the single most important thing to learn from this Facebook chatbot episode is that it’s always important to have an “off” switch that can’t be overridden by the AI system. If Tony Stark had built something like that into Ultron, we wouldn’t have had an Avengers movie. But in the real world, a kill switch is absolutely necessary as we continue to develop smarter and smarter computers. I’m sure both Zuckerberg and Musk would agree on that point.

Sunday
Jul 30, 2017

Warding off dementia

As we age, all kinds of diseases can start to crop up. This isn’t to say young people can’t be afflicted with terrible diseases like cancer, diabetes and heart disease, but many of these illnesses do tend to appear as we grow older.

Dementia is another affliction that’s commonly associated with age. Research tells us that a lot of it has to do with genetics and other factors that are beyond our control. But a recent report by The Lancet Commission on Dementia Prevention, Intervention and Care says that at least 35 per cent of dementia cases can be traced to lifestyle factors that we have the power to modify.

The study brought together 24 international experts to review existing dementia research in order to provide recommendations for treating and preventing dementia.

According to a 2015 survey, there were about 47 million people around the world living with dementia (including Alzheimer’s, which is a form of dementia). With better healthcare, people are now living longer than ever and as the population ages, that figure will only increase. It’s estimated that the number of people afflicted with dementia will rise to 115 million by 2050.

Besides the huge healthcare costs — the global estimate in 2015 was US$818 billion — there are tremendous social costs incurred by family members of dementia patients. It’s a common enough disease that most of us know someone who has dementia — it could be a parent, a relative, a friend — and we can clearly see how difficult and heart-breaking this situation can be. For sure, we don’t want to be afflicted with dementia ourselves.

While dementia is not entirely preventable and there’s currently no drug treatment to cure it, the good news is that there are behavioural and lifestyle changes that can significantly improve our chances of warding off dementia.

Preventive measures

According to the Lancet report, about one-third of dementia cases could be prevented by addressing nine modifiable risk factors, spread across various stages of life, that affect the potential for developing the disease.

These factors are: staying in school until over the age of 15; exercising; reducing depression and social isolation later in life; avoiding hearing loss in mid-life; not smoking; and reducing high blood pressure, obesity and diabetes.

In short, it’s good to increase education, physical activity and social contact; and minimise (or if possible, eliminate) hearing loss, smoking, depression, hypertension, obesity and diabetes.

It’s worth mentioning that these factors do not carry equal weight. Some are more impactful than others.

For example, one of the most impactful factors is one that researchers had not identified before: hearing loss. They now estimate that reducing hearing loss in mid-life would cut the number of dementia cases by as much as 9 per cent. You might be wondering what hearing loss has to do with dementia. While there’s no certainty yet, the researchers believe it may have something to do with the social isolation that people go through when they lose the ability to hear well. As such, we should take care not to listen to music too loudly on our headphones. It could haunt us later in life in ways most of us wouldn’t expect.

The second biggest factor is education. The researchers say that increasing education in early life (defined as studying until over the age of 15) can help reduce dementia by 8 per cent. It’s believed that education and other mentally stimulating tasks help the brain to build up its neural networks (or “cognitive reserve” as the researchers put it) which will be useful for allowing the brain to continue to function well even when it starts to decline due to age.

The third major preventable factor has to do with smoking. We all know smoking is bad for physical health; now we know it negatively affects mental health as well. Dementia could be reduced by as much as 5 per cent if everyone stopped smoking, the researchers believe. Smoking harms heart health and this in turn affects brain health. The healthier your body is, the healthier your brain will be. It’s as simple as that.

It’s important to point out that even if we take heed of all nine factors, it doesn’t mean we can definitely stave off dementia. In fact, some 65 per cent of dementia cases are not preventable no matter what precautions are taken. But we should take heart in the fact that we now have a better understanding of what to do more of and what to reduce or cut out.

For those who think it’s too troublesome to remember all nine factors, let me reduce it to three simple things: stay physically and mentally active and watch what you eat.

Other considerations

Why physical exercise is important is quite straightforward. According to Gary Small, director of UCLA’s Longevity Centre and author of The Alzheimer’s Prevention Programme, when your heart is really pumping, more nutrients and oxygen get delivered to your brain. The body also secretes protective chemicals during physical activity, including a protein called brain-derived neurotrophic factor, which is believed to spark the growth of neurons. “Exercise can’t guarantee that you won’t get Alzheimer’s, of course,” he says. “But the hope is to delay the disease long enough so that you never experience symptoms in your lifetime.”

The importance of mental exercise is obvious too. The more we work out our brain, the fitter it stays. But mental exercise doesn’t have to mean doing puzzles and brain quizzes. It can be as simple as trying new things, such as taking new routes to get home, according to UCLA’s Small. Generally, anything that gets your brain working is good, he says. Repetitive mental exercises aren’t that helpful though. Once a task becomes repetitive, the brain work involved becomes rote, which means there’s less neural activity going on.

Food plays a big role in health so it makes sense to eat the right things and avoid too much junk food. What you drink can make a difference too. It’s best to avoid alcohol. Although red wine has some anti-oxidants that can be good for your heart, there’s too much bad that comes with the good. No doctor will recommend consuming alcohol on health grounds, as alcohol contributes to dozens of negative medical conditions, including various cancers, high blood pressure, liver cirrhosis and depression.

There’s good news if you like coffee and tea though. Both these beverages seem to be good for warding off dementia. A 2009 study done in Finland found that subjects who regularly drank coffee had a 65 per cent lower risk of dementia and Alzheimer’s. The researchers for that study followed the drinking habits of 1,400 coffee drinkers for more than two decades and found one group that seemed to benefit the most: those who’d been drinking three to five cups of coffee a day in their 40s and 50s.

More recent research from Singapore has found that drinking black, green or oolong tea can help reduce the risk of dementia in older people by 50 per cent. And for those who are genetically at risk of Alzheimer’s disease (carriers of the APOE e4 gene), the risk was reduced even further, by 86 per cent. The study involved 957 Chinese seniors aged 55 and above who regularly drank tea.

“Tea is one of the most widely consumed beverages in the world,” said Feng Lei, the study’s lead author from the National University of Singapore. “The data from our study suggests that a simple and inexpensive lifestyle measure such as daily tea drinking can reduce a person’s risk of developing neurocognitive disorders in late life.”

So, stay active physically and mentally, eat sensible meals and drink your coffee or tea. Such habits will go a long way towards reducing the risk of dementia, a disease that greatly impairs the quality of life not just for those who are afflicted but also their family members and loved ones.

Sunday
Jul 23, 2017

Meat without animals

One of the consequences of having to produce meat to feed billions of people around the world is the rise of factory farming, where animals are reared in cramped cages, pumped full of antibiotics and fed hormones to make them grow faster. They’re slaughtered the moment they’re big enough to be processed for their meat.

This method of meat production is cost-efficient but inflicts tremendous misery on the animals. If you have any doubt about that just Google “battery farming” or type in those keywords on YouTube and you can see for yourself what it’s like.

It also inflicts enormous damage on the environment. According to the United Nations, meat production accounts for 18 per cent of global greenhouse gas emissions. To get a sense of how bad that is, that’s a higher figure than what’s attributed to all the world’s transportation vehicles combined!

MEATLESS MEAT

The good news is that advances in technology will soon make it possible for us to have meat without having to rear and kill animals. Actually the technology is already here but it’s just not quite ready for prime time yet. But it will be — very soon.

There are two ways to have meat without animals. The first one, the conventional approach, is to have vegetarian meat made out of plant material. Of course, mock meat has been around for a long time. Go to any Chinese vegetarian restaurant and you can find plenty of fake meat options on the menu.

They even look like real meat but they don’t taste anything like it. Real meat lovers wouldn’t touch mock meat with a 10-foot pole. For mock meat to be really successful, it has to be so similar to real meat that carnivores will actually want to eat it.

Impossible, you say? Well, a company in the US, aptly called Impossible Foods, has developed a plant-based burger patty that’s said to be indistinguishable from the real thing. In taste tests conducted by the company, which included offering free burgers to construction workers, many who tried the burger couldn’t tell that it was not actual beef.

Founded by Patrick Brown, a former biochemistry professor at Stanford University, the company sought to create mock meat that appeals not just to vegetarians but also to the meat-eating mass market. Brown has famously said that his target market is not vegetarians but meat eaters.

To achieve this feat, his team of scientists spent five years researching what makes meat taste the way it does. The answer is something called “heme”, an iron-containing compound found in animal flesh. Apparently, it’s heme that gives red meat its taste and colour. They needed to mimic that.

In animals, heme is carried in a protein called myoglobin, but there’s a plant-based counterpart called leghemoglobin, which can be obtained from soy plants. When added to the mock meat, it makes it smell, taste and look like real meat.

Impossible Foods is a serious venture that has raised US$182 million (RM780 million) to date. It’s building a factory that can churn out up to four million burger patties per month once the site is fully functional by the end of this year.

If you’re one who prefers seafood over meat, don’t worry. There’s a mock seafood start-up called New Wave Foods that’s looking out for your business too. In line with its slogan, “We disrupt seafood, not oceans”, New Wave is taking a plant-based approach to producing “seafood”.

Its first product is a mock shrimp made from algae oil and pea protein. The company says that it will be ready to launch this product commercially by the end of next year.

ANIMAL-LESS MEAT INDUSTRY

The newer, more radical approach to animal-less meat involves growing actual meat in a lab. This sounds like science fiction but the technology is already a few years old. It was in 2013 that Dr Mark Post, a Holland-based researcher, introduced the world to the first lab-grown hamburger patty.

With the world’s press in attendance, Austrian food scientist Hanni Rutzler was invited to sample the burger. Her assessment: “intense taste” but “not that juicy”.

That single burger cost a whopping US$325,000 to produce. Rapid advances in biotechnology have since brought the price down dramatically, to the point that lab-grown meat is on the verge of being commercially viable.

A company called Memphis Meats says it’s able to produce chicken meat, duck meat and beef meatballs from animal cells in a lab. It estimates that its products will be able to go to market by 2021.

Another company called Hampton Creek, which is known for its vegan mayonnaise (no eggs are used), says it too is working on lab-made meat. Dr Post, who introduced the lab-grown burger in 2013, also has his own start-up called Mosa Meat, which is working on the same concept. With all these rival companies spurring each other on, it looks like we could have a viable animal-less meat industry sooner rather than later.

And just as there’s a seafood counterpart to mock meat, there’s also a seafood counterpart to animal-less meat. Appropriately named Finless Foods, this company is looking to make cultured tuna from fish cells.

Says the company: “Money is being poured into creating efficient aquaculture systems, to grow fish in tanks on land for human consumption. While this is a move in the right direction, if we’re going to make this system as efficient as possible we need to rethink things from the bottom up.”

They add: “Aquaculture is a system of inputs and outputs, why would we have our expensive food inputs create energy for the fish only to have that energy diverted into things we don’t need, like swimming or having a heartbeat? Why can’t we have a system that only puts energy into growing the parts that people want?”

WORLD CHANGING

Why not indeed? How the company plans to do this is by using a combination of established and cutting-edge cell culture techniques. “We’ll then design a cheap and efficient growth media for this cell line that will allow our cells to grow quickly,” the company says. “Once we have this, we will lay the cells out on a structure that’ll shape them to both look and have the texture of real fish meat, because it will be — on a cellular level — real fish meat.”

Whether you’re into mock or animal-less versions of meat and seafood, you’ll soon be able to buy all these remarkable products. Some of the items will be ready as soon as the end of this year and some might take a few years more. But they’re coming.

For sure, in the initial years, the prices of these items will be higher than the real thing. As such there won’t be mass adoption. Early adopters will be those who care about animal welfare and the environment — not those whose main consideration is price. So, it will be a niche market for a while.

Over time though, costs will naturally drop and when it reaches a point where it’s actually cheaper than the real stuff, that’s when you’ll have your tipping point. Only then will a mass consumer market for these products emerge. When this happens — it may take decades but it will be within our lifetime — it will be truly world-changing.

Sunday
Jul 16, 2017

Five ways tertiary education will change

Back when I went to university, there weren’t many choices when it came to tertiary education. If you weren’t able to get into a public university, you’d have to apply to study abroad. The situation has since changed dramatically.


Today, there are many private colleges and universities. But the tertiary education process is still largely the same. You select a field of study, you attend classes and make notes, you study and take exams, you graduate and with that paper qualification you apply for a job.

This is a tried and true way to secure gainful employment. But two things are happening that are changing that dynamic for tertiary education. Firstly, the nature of jobs is changing rapidly and the current system isn’t churning out suitable graduates. Secondly, as it does with other facets of life, technology is causing major disruptions that will alter the way tertiary education is delivered.

Education advocates have been saying for decades that the education system needs to change. And change is finally starting to happen though it’s not the government that will take the lead but market forces.

Here are five predictions on how education will change over the course of the next decade.

1. Many options

It used to be that educational institutions determined the options available to students. But increasingly, it has become a students’ market, and they’re now in a much better position to set their own educational agenda. Online education, in particular, will give students far greater choices than before.

Students will be able to carve out a study programme that allows them to study at their own pace, anywhere they want. Best of all, they’re not limited to one provider. There’s no reason why a student can’t take courses from different online institutions and learn from a diverse range of providers.

Universities used to be all local but today we have several foreign universities with local campuses. That takes a huge amount of investment, though. Those with strong online arms can become global universities without actually having a campus in the countries they offer their courses in.

2. Degrees will be less important

Growing up, we heard the mantra that "paper qualifications" are necessary to get a good job. Traditional degrees will still be needed for strictly regulated careers like medicine, accountancy and law, but for many of today's jobs, an actual university degree may not be necessary. You don't have to be an English or Journalism major to work as a writer (I have neither of those degrees).

Similarly, you don’t have to have a degree in graphic design to work in that field; or a culinary degree to become a successful chef. It’s how good you are that counts. And this will be increasingly true with most jobs.

This doesn’t mean that people don’t have to get trained anymore. It just doesn’t have to be framed in very narrow terms like a Bachelor’s Degree. In fact, for certain professions, it’s better not to go the traditional route to get an education. For example, a person who wants to work in social media marketing would do better to take online courses, which are constantly updated, than a university marketing course that cannot possibly be as up-to-date as the online ones.

These courses may not confer a degree per se but they certainly equip the student with the necessary knowledge to run successful social media campaigns for brands.

3. Continuous learning

A university education used to be just for young people. They’d usually be in their late teens or early 20s. They’re unlikely to be in their 30s and certainly not in their 40s. But why should that be? The reason in the past was that only young people with few obligations could afford to devote four years of their lives to attending classes. But that’s all changing with the advent of online courses. Now, anyone at any age can pick up new skill sets. And this is necessary because of how jobs are evolving.

Old industries are fading away while new ones are starting to boom. An example of the latter is the self-driving car industry. A friend, Kegan Gan, who is a father of three and works as an app developer, is taking a “Nanodegree” course from Udacity (www.udacity.com) that will train him to become a self-driving car engineer.

The course, which deals with topics like machine learning, computer vision, vehicle kinematics, sensor fusion and automotive hardware, was designed in collaboration with some of the most innovative brands in this area including Mercedes-Benz, Uber, BMW and McLaren. You could say he’s going back to school albeit in an online way.

Education should no longer be viewed as a one-time experience that people go through in their youth but a continuous journey of acquiring knowledge and skill sets to keep them relevant and marketable in an ever-changing job landscape.

4. Online learning

I'm a big fan of online learning, of which there are many types. Some are informal and practical in nature, such as Lynda.com and Udemy, whose teachers are usually drawn from industry. Others, like Khan Academy and Coursera, are more academically oriented. Then there are sites like Udacity, which is nominally academic but very industry-focused.

Although there's certainly something to be said for in-person instruction, online courses are in many ways superior. For one thing, they give you access to some of the best instructors in the world, something that would be hard, and certainly very expensive, to obtain in person. For example, I subscribe to an online judo instruction site called Superstarjudo.com which delivers video lessons by former world and Olympic champions. I get to learn from the best, watching them demonstrate their techniques in high definition, slow motion and from multiple angles. It's even better than attending a live seminar, where you might miss something because it happened too fast.

5. The hybrid institution

The rise of online education systems won’t render physical institutions obsolete. There’ll always be a need for university campuses for a variety of reasons. Students meeting up to work on projects together is an important part of the university experience. Lab work still needs a physical presence; extra-curricular activities too.

Don't forget, going to university has never been only about studies. It's at university that you get to meet people from all walks of life, from different social, economic and religious backgrounds, far more so than you would when you enter the workforce. And it's at university that you form the early beginnings of your future business networks. Some of the people you meet there could be the ones you work with or do business with in the future.

So, the physical institution is useful and important for a student's overall development. That's why, going forward, more and more institutions will adopt a hybrid model whereby they offer some instruction through digital and online means but retain on-campus components for things that cannot be done online.

Revolutionising education has long been talked about in theoretical terms, with very little change taking effect due to the "If it ain't broke, why fix it?" mentality. That mentality kept tertiary education, in particular, stuck in limbo for decades. But the problem is that today, the old way is broken and will clearly not be able to cater to the rapidly changing global economy.

For sure, some institutions will falter and experience the “Kodak moment”. There’ll be some casualties among those which either cannot or refuse to keep pace with the changes. I suspect though that many will rise to the challenge, seize the opportunities that digital transformation of education can offer and thrive in the new environment.