I was frustrated with my students. Day after day I gave them homework assignments, and with predictable consistency they returned the next day with the work either incomplete or not even attempted. This was not looking good for them. These were first-year students, and the module I was teaching them, Java programming, was mandatory: they needed to pass to proceed to the second year. At the rate they were going, they were not going to make it.

I understood that learning the Java programming language is tough. Some say it is as tough as learning mathematics, but I’ve taught both and can say with confidence that Java is much more challenging for students than mathematics. Keeping in mind the challenging nature of the language, I made every attempt to ensure my students understood the concepts by constantly testing them throughout the lesson. By the end of each lesson, I was confident they did understand the concepts. But this only added to the mystery: if they understood the concepts, why didn’t they complete their assignments?

I decided to have one-on-one chats with them, and during those chats a student said something that changed my teaching career forever: “When I need you the most, you are not there.” What he meant was that when he attempted the homework assignments, he often ran into challenges and needed to ask questions; but because he was working at home, he had no way to put those questions to me.

After much research, I realised that I needed to change my approach. The solution, it turned out, was to adopt the “flipped classroom” model, which essentially recommends that learning should be done at home, and homework should be done at school. How would this be possible? The answer was to incorporate technology into the learning and teaching process. I began to make video recordings of my lectures and upload them to YouTube.
Then I asked my students to go through the videos at home, along with a list of other online resources, and to ensure they understood the concepts thoroughly before coming to class. When they arrived the next day, I gave them their assignments, which they completed in class.

The solution worked like a charm. My students took to this new method like ducks to water and began to perform exceptionally well. Our mutual frustration was over, and I learned a valuable lesson: the way I approached education needed a major overhaul. The reason my students thrived in the new environment was that I moved away from the traditional, teacher-centric, lecture-based method to a new one, where the focus of attention was not the teacher but the student. I was no longer the main actor. I became the “guide on the side” rather than the “sage on the stage”.

This is the first major stumbling block in education. We are so entrenched in the traditional mode of education that we fail to realise that it is incompatible with the learning preferences of modern learners. Not only that, but the content is outdated and the methodology inadequate to prepare learners for the world of work in a technology-driven age.

5 C’s of Education

What is desperately needed is a new approach to education, one that encompasses the five “Cs” that are compulsory in 21st-century education: choice, collaboration, communication, critical thinking and creativity. In my Java classes I gave my learners choice – the choice to learn the concepts how, when and where they wanted. I also gave them the opportunity to collaborate and communicate with each other: rather than complete the assignments alone, they would do them in pairs or groups. This required them to work collaboratively and to communicate constantly with each other. At the end of each session, groups were required to critique each other’s work, and to justify their own approaches to solving problems.
This encouraged critical thinking and creativity. This solution would not have been possible two decades ago, but thanks to technology, it is well within the grasp of every teacher and student. Today, there are hundreds of technology solutions radically transforming education.

In light of these advancements, people often ask whether technology will eventually replace teachers. On the contrary: the role of the teacher will become more important. Teachers in the future will be free to focus on each student, helping them with specific challenges and guiding them along. Most importantly, they will practise the sixth “C” of education, one that no machine will ever be able to replace – compassion. So my standard response to the question is this: technology will never replace teachers, but teachers who know technology will replace those who don’t.
At our recent workshop at the Valley Trust, we were honoured to be joined by two very special guests, Iman Malaka, CEO of Tic-IT Telecoms, and Thapelo Nthite, Co-Founder of Tic-Ai, who took time out of their very busy lives and flew in from Cape Town to give our students a crash course in Artificial Intelligence. A few lucky students even got to walk away with brand new smartphones!

The expert knowledge gained by our students will go a long way, not just in terms of learning, but also in terms of inspiration: seeing people who come from ordinary backgrounds, who have gone out into the world, who have faced tremendous odds, who have made it for themselves and are now giving back to their communities.

“We think it’s very important to expose our youth, from whatever background, to what the world is going through. We want Africans to create solutions for Africa because for a very long time a lot of solutions and advancements in technology have been very far from us and we feel if the youth know what’s out there, what’s possible and what the rest of the world is doing, they can innovate and come up with solutions that can improve and help their direct communities” ~ Thapelo Nthite, Co-Founder of Tic-Ai.

“This, for us, is an opportunity to share and impart knowledge with the learners and also to teach them that you can use the problems or the challenges or gaps within your communities to build solutions. So it is African solutions to African problems” ~ Iman Malaka, CEO of Tic-IT Telecoms.
“Social media is the new way of bragging for those who commit crimes to gain a sense of self-power or self-importance. The audience is larger now and, perhaps, more seductive to those who are committing antisocial acts to fill personal needs of self-aggrandizement.”

When Mark Zuckerberg started Facebook in 2004, there was no way he could have known his little website would become the world’s most popular social network with nearly two billion registered users. He also could not have known his social networking site would be instrumental in one of the most heinous crimes committed in the past century.

The world was left shocked and dumbfounded last Friday when a self-described white supremacist, armed with semi-automatic weapons, walked into two mosques in Christchurch, New Zealand, and opened fire on the crowds, killing 50 people and wounding dozens of others. Among those killed was three-year-old Mucad Ibrahim. The incident sent shock waves throughout the world. New Zealand is a country that, unlike the US, is not generally known for mass shootings and violent crimes. “It is clear that this can now only be described as a terrorist attack,” New Zealand Prime Minister Jacinda Ardern said soon after the shooting.

What made the crime exponentially more horrific was that the perpetrator live-streamed it on Facebook for the world to see in graphic, high-definition video. It is difficult to grasp the depravity of this act. “We have a lot of work, and we will keep doing all we can to prevent tragedies like this from happening,” Zuckerberg said.

Facebook reacted swiftly by banning the video, and even worked with competing social networks to quell its spread before it went viral. The company released a statement relating to the incident via its newsroom blog, expressing its horror and condolences.
“Our hearts go out to the victims, their families and the community affected by the horrific terrorist attacks in Christchurch,” wrote Chris Sonderby, vice-president and deputy general counsel at Facebook. The company has been working closely with the New Zealand police to support their investigation, and is providing an on-the-ground resource for law enforcement authorities.

In what is seen as an unprecedented move by the social network, Facebook released detailed statistics about the video to the general public. According to the statistics, the video was uploaded by Facebook users 1.5 million times within 24 hours of the attack. Facebook’s artificial intelligence detection systems managed to block 1.2 million of those uploads, but that still left approximately 300 000 copies to slip through the cracks. Facebook took additional measures to curb the video’s reach, such as designating the shootings as a terror attack and working with the Global Internet Forum to Counter Terrorism to report anyone praising or supporting the incident to the relevant law enforcement officials.

Other platforms like YouTube were also flooded with uploads of the horrific video. According to a representative from the streaming video service, at one stage copies of the video were being uploaded at a rate of one copy per second – an unprecedented rate for YouTube uploads. The Washington Post reported that YouTube was going out of its way to block the video. Despite this, people still managed to get copies through by re-editing and repackaging the video to make it look innocent. Blocking the video on instant messaging platforms like WhatsApp was much more challenging, with people forwarding and broadcasting it to all their contacts. On platforms such as Reddit and 8chan, where hate speech is common, the video went viral without much resistance.
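The figures Facebook released are internally consistent, and the arithmetic is worth making explicit: blocking 1.2 million of 1.5 million attempted uploads is an 80% automated detection rate, which is how roughly 300 000 copies still got through. A quick back-of-the-envelope check in Python (the upload and block counts are the figures reported above; the rate calculation is my own):

```python
# Figures Facebook reported for the 24 hours after the attack
attempted_uploads = 1_500_000
blocked_by_ai = 1_200_000

# Copies not caught by the automated systems, and the overall detection rate
slipped_through = attempted_uploads - blocked_by_ai
detection_rate = blocked_by_ai / attempted_uploads

print(f"Copies that slipped through: {slipped_through:,}")  # → 300,000
print(f"Automated detection rate: {detection_rate:.0%}")    # → 80%
```

Even a detection rate that sounds high in percentage terms leaves an enormous absolute number of copies when the volume of uploads is in the millions.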
While some people have lauded the likes of Facebook and YouTube for their willingness to block the spread of the video and to co-operate with law officials, others say they had no choice, because they faced major backlash and possible reputational risk because of the incident. The question that researchers and others are increasingly asking is whether the negative consequences of social media will inevitably outweigh any benefits. Facebook, in particular, has received a lot of negative publicity in recent times. Either way, the one thing that comes into question is whether social networks are doing enough to prevent their platforms from being used to spread hate and violence.

No one denies the immensely powerful reach of social media and the ability of a simple message, image or video to go global within minutes. It is the information equivalent of a nuclear weapon. Putting the incident into perspective, we can say that Facebook effectively gave a depraved, sick, twisted, hate-mongering terrorist a potential live audience of two billion people to spread his hate. This may sound scary, because it really is.

Whereas many people hold the social networks responsible for the spread of the horrifying video, I differ. Social media gives us previously unimaginable power to share positive content and to educate people. After all, I was live on Facebook just a couple of weeks ago, talking to business people and students from around the world about the impact of artificial intelligence on business and careers. If anyone is to blame for the spread of the video, it is the people who, for some reason or other, felt that seeing 50 innocent human beings being mercilessly shot in the back was entertaining or worthy of sharing with others. Social media was just the means.
Having said that, I believe that social networks have a responsibility to ensure their platforms are never used for evil, and should put all the necessary checks and balances into place before allowing people to post content, especially live video.
“There is a real danger that computers will develop intelligence and take over. We urgently need to develop direct connections to the brain so that computers can add to human intelligence rather than be in opposition” – Stephen Hawking

We are about to enter a new era in human history, one where the blind will be able to see, the deaf to hear and the disabled to walk again. People with brain diseases such as Alzheimer’s, epilepsy, Parkinson’s and dementia will be cured. This may sound like some kind of biblical premonition but, in reality, there are technologies being developed that have the potential to accomplish all those things. Not only that, but these technologies will give humans superpowers: humans will be able to gather, process and calculate data as quickly and efficiently as computers do. Learning will be as simple as downloading data into the brain via a USB cable. Whether one is learning a new skill or a new language, this will be accomplished in minutes.

Recently Elon Musk, founder of Tesla and SpaceX, announced a new technology called Neuralink, which will enable the human brain to connect directly into computers, allowing free communication between computer and brain as if the brain were just another piece of hardware. If this sounds scary, it’s because it is. Whereas past technologies have changed things around us, this one has the potential to transform humanity forever by changing the very essence of who we are.

Although Neuralink is a new and revolutionary technology, the whole concept of computers connecting directly to the brain, also known as the brain-machine interface, or BMI, has been around since the 1950s, and a number of devices have been developed. BMIs work by sticking electrodes into the brain and then monitoring and interpreting brain impulses.
One such device, the Utah Array, is a brain implant that enables people to control computers and even send text messages using their minds. This technology has been around since 2012. Similarly, a wristband developed by CTRL-Labs in New York City allows the wearer to control a computer without moving.

Neuralink is significantly different from other BMIs, and immensely more capable. The Utah Array works with just 256 electrodes, each about half a millimetre thick. On the other hand, a single Neuralink chip has 1 000 electrodes, and up to 10 chips can be implanted into a single brain. In other words, Neuralink can insert up to 10 000 electrodes into a brain. The electrodes, also called threads, are extremely small: each one is about 10 microns thick. To get an idea of how small that is, a red blood cell is 8 microns wide. Because they are so small, the threads are able to penetrate deep into the brain without causing any damage. Rather than damaging brain tissue and blood vessels, they simply slip in between them. Their microscopic size gives them another huge advantage: they are able to communicate with individual brain cells, or neurons.

This kind of technology can provide a whole host of benefits, like curing brain diseases. Many brain diseases are caused when brain tissue is damaged. Neuralink could potentially cure these diseases by fixing the motor functions in the brain. Then there’s the ability to control computers and other devices with the mind, which in itself has amazing potential. Some day people could control robot arms and legs attached to their bodies in lieu of limbs they lost. Paraplegics could control their wheelchairs with their minds or, better still, be able to walk again, thanks to Iron Man-like exoskeletons they control with their minds. It may be possible to download digital content like books directly into the brain, and even augment the brain by giving it computer-like maths, logic and processing power.
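To put the scale difference in concrete terms, the electrode counts above imply that a fully implanted Neuralink system would carry roughly 39 times as many electrodes as a Utah Array. A small Python sketch (the per-chip count, chip limit and Utah Array count are the figures cited above; the ratio is simple arithmetic, not a quoted figure):

```python
# Electrode counts cited in the text
utah_array_electrodes = 256
electrodes_per_chip = 1_000
max_chips_per_brain = 10

# Maximum electrodes in a fully implanted Neuralink system
total_electrodes = electrodes_per_chip * max_chips_per_brain
print(total_electrodes)                           # → 10000
print(total_electrodes / utah_array_electrodes)   # → 39.0625
```

The jump from hundreds to tens of thousands of electrodes, each thin enough to target individual neurons, is what makes the claimed capabilities at least plausible in principle.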
By connecting to the internet, Neuralink will bring all the knowledge and information on the internet directly into the brain. Neuralink might also some day cure blindness and deafness by connecting directly into the auditory and visual cortices of the brain, transferring signals from powerful cameras and microphones directly into the brain and giving people superhuman vision and hearing. Taking things even further, there might come a time when, rather than watching movies or playing video games for entertainment, people are connected directly into fully immersive, digitally created imaginary worlds, such as in the movie The Matrix. In these worlds, all the senses will be stimulated – sight, hearing, touch and even smell – and everything will feel as real as in real life.

Neuralink will also enable scientists to better understand the human brain by sending data about the brain to computers. This data will then be analysed and studied, unravelling some of the mysteries of the brain that have eluded us for centuries. Although we can make dozens of predictions about what Neuralink will be capable of, the truth is that nobody really knows what its true capabilities are. When the brain and Neuralink eventually fuse and adapt to each other, they will undoubtedly develop a kind of synergy that no one is able to predict. Such is the mysterious nature of artificial intelligence, and of our own brains: they are, at best, unpredictable. What we do know is that Elon Musk wants to perform the first human testing of Neuralink by 2020. What lies beyond is anybody’s guess.
Established in 1953, The Valley Trust in Botha’s Hill aims to uplift the Valley of a Thousand Hills community by eradicating poverty and improving the overall quality of life of the community through various health, educational and entrepreneurial projects. One of the key focuses of the Trust is the youth. According to the Trust: “Youth in rural areas only have a 4% chance of making it into an institution of further education and training, a 13% chance of finding employment and are likely to live below the poverty line all their adult lives until they become eligible for a government old age grant at age 60”.

The broad objective of the Valley Youth Leadership Development Programme is to improve the life chances of the youth of the Valley of a Thousand Hills. It aims to do this by fulfilling certain specific objectives:

- To equip 60 young people with the knowledge and skills to lead, by supporting them to form and lead youth groups.
- To develop the confidence and skills of 60 young people to negotiate access to opportunities for youth, by leading youth groups to attain their stated goals.
- To enhance the capacity of 780 young people to identify and take up opportunities for employment, entrepreneurship and/or further education, through exposure and access to resources and opportunities.
- To improve the health and well-being of 780 young people through education and participation in health-focused campaigns.
- To strengthen civic responsibility in 780 young people by providing opportunities for them to contribute to the development of their communities.

IT varsity was established to address the IT skills shortage in South Africa by providing an institution of hardcore learning and direct preparation for the world of work.
IT varsity’s vision is to empower 100 students with not just technical skills, but also entrepreneurship skills and a go-getter attitude. That’s where the name Apptrepreneur comes from: it’s a combination of the words ‘App’ and ‘Entrepreneur’. Mission 2020 sees the upskilling of 100 learners in app development in 2019, and the placement of those learners into job positions in 2020.

Because of the alignment of their visions and missions, IT varsity has partnered with the Valley Trust to launch Project Apptrepreneur at Botha’s Hill. The objective of the project is to train 20 disadvantaged students to become app developers. The project, which will run for the rest of the year, will see all students gaining a national certificate in software development, as well as practical mentorship and guidance from IT varsity’s founders and directors, Bilal Kathrada and Maseehullah Kathrada, both of whom are award-winning app developers themselves. Project Apptrepreneur will be instrumental in breaking the cycle of poverty and unemployment!
IT varsity promotes a culture of learning that not only values academic success but elevates and celebrates students who prove they value education by showing academic responsibility and curiosity, and by being willing to go all out towards achieving their goals. We are excited to announce our Student of the Month, Shaheen Safedien, who hails all the way from Port Elizabeth. Through the Apptrepreneur course, Shaheen is finally realising his dream to be a part of technology that not only creates products that make life so much simpler, but also has the potential to change the world! After all, “the best way to predict the future is to create it”. Between coding and a very busy life, Shaheen took some time to talk to us about himself, his love of coding and the Apptrepreneur course.

IT varsity: Well done on being chosen as Student of the Month! Please tell us a bit about yourself.

Shaheen: I am from Port Elizabeth, currently working in retail. I started off in 2015 as a store manager, where I performed duties to fulfil standard operations that uphold quality service.

IT varsity: They say every dream begins with a dreamer. What are your dreams and goals?

Shaheen: My dream would be a combination of creating products that are making a difference in the world and getting to share them with everyone out there. A simple piece of code can create something that has the potential to change the world!

IT varsity: That is awesome! Please tell us what you think about the Apptrepreneur course and how it is benefiting you.

Shaheen: It’s really exciting writing code and creating different things. I started at nothing, but with the help and support from IT varsity, I have developed a passion for coding, being creative, designing things and continuously improving. Technology is, undoubtedly, the future, and will be here for a very long time, so developers have no worries about the industry disappearing. Being able to see and be a part of technology that makes life easier is so cool.
Imagine: we can have food delivered, parcels delivered, and get transport and a host of other services at the touch of a button, thanks to apps like Uber, Mr D and UPS. These apps started with an idea and became a way of life. This is what IT varsity offers with the app course: a way and means to make a difference.

IT varsity: What is your ambition, and where do you see yourself going with this course?

Shaheen: I’d love to go further into this field because there are always new things coming out and new ideas. Technology sees the need and fulfils it.

IT varsity: What advice would you give to other students like yourself out there?

Shaheen: I would say it’s best to stay focused; it will be yours when you set your mind to it. Be confident and keep practising, even by building new apps and websites from scratch. Keep a good pace and don’t overwork yourself or force too much at once. Enjoy it as it comes and it will flow. Stay cool 😎
Technology has always played a role in warfare, and tech companies can earn a fortune by selling tech to the military. But employees of many tech companies are unhappy about their tech being weaponised. Who’s right? Should tech companies work with governments and militaries knowing that their technology might be used for war? This is a major dilemma that US tech giants like Microsoft and Google are facing.

HoloLens

Last year, Microsoft won an extremely lucrative $480 million contract to supply the US military with its HoloLens augmented reality headsets. The headsets, which are worn like goggles, give the wearer a “heads-up” computer display, allowing the user to interact with a computer while on the move. Although the HoloLens systems will initially be used by the military for training purposes, the plan is to eventually use them in combat, providing soldiers with a heads-up display similar to those that fighter pilots have. The headsets will not only provide vital health and other information to soldiers, but will also use artificial intelligence to navigate them through war zones and recognise targeted persons and combatants.

Microsoft’s employees are not too thrilled about the company’s decision to work with the military. They prepared a petition protesting against their work being used for military purposes and demanding that Microsoft cancel the deal. The petition, which was published on Twitter, stated that “we did not sign up to develop weapons, and we demand a say in how our work is used”. Microsoft chief executive Satya Nadella defended the company’s decision in a CNN interview, stating that “we made a principled decision that we’re not going to withhold technology from institutions that we have elected in democracies to protect the freedoms we enjoy”. This is part of a wider trend in the tech industry.
Thousands of tech workers are voicing their unhappiness about their companies working with the government, and especially the military.

Google abandons JEDI

Last year, Google cancelled a contract to provide the US government with artificial intelligence software to enhance drone footage. This came after 4 000 Google employees signed a letter of petition against the contract and sent it to Sundar Pichai, the company’s chief executive. “We believe that Google should not be in the business of war,” stated the letter, whose signatories included some of Google’s top engineers. The petition further demanded that Google announce a policy never to build warfare technology. These protests also led to Google pulling out of a $10 billion bid for a Department of Defense cloud computing contract known as Joint Enterprise Defense Infrastructure, or JEDI.

Despite similar protests from their own employees, competing tech giants like Microsoft, IBM, Amazon and Oracle continued to bid for the contract. Amazon chief executive Jeff Bezos confirmed the company’s stance on military contracts in an interview late last year, saying: “If big tech companies are going to turn their back on the US Department of Defense, this country is going to be in trouble.”

Similarly, dozens of tech start-ups are also aiming to clinch lucrative military contracts via a new initiative launched by the US Air Force, called “Demo Day”, which is the air force’s way of trying to attract more cutting-edge innovations from the commercial sector. At the first ever Demo Day, over 50 start-ups pitched their ideas, which ranged from artificial intelligence systems to cyber security, 3D mapping and medical tech. When asked to comment on the possibility that their technologies would be used for war, they avoided the minefield by simply stating that they would be happy as long as their tech was used for good.
In reality, there is no way to know where their technology will be used, and neither are there any guarantees. The military is, out of necessity, shrouded in layers of secrecy and is not obligated in any way to disclose what it is going to do with the technology. It would be an obvious national security risk if it did.

The definition of “good”

Then there is the definition of the word “good”, which is a massive grey area. The understanding of what constitutes a “good” act will undoubtedly vary based on different perspectives, and what may be good for one person is not necessarily good for the next. Of course, as much as the likes of Nadella and Bezos may play the patriotic card, no one can deny that there are capitalist motivations at play, because military contracts are highly lucrative.

Ultimately, the question around government and military contracts presents a major conundrum for tech companies of all sizes. Should they yield to the sensitivities of their staff and customers, or should they do what they think is best for their businesses? Either way, they face a public relations minefield that they will need to tread through with extreme caution. But there is another dimension to this debate that most people overlook: the military has played a key role in the development of many of the technologies we take for granted today, and had it not been for the military, it is possible those technologies would not exist. I will discuss these technologies in the next article.
The year was 1400 BC, and the city of Thebes was the capital and crown jewel of the New Kingdom of the Egyptian empire. Located on the banks of the Nile river, 800km south of the Nile Delta, it was a sprawling metropolis and the largest city in the world, with a population of 80 000. Of course, as has been the pattern throughout human history, civilisations rise and fall; and 1 000 years later, the Egyptian Empire had relinquished power to the new superpower of the world, the Roman Empire. The capital of the Roman Empire was the city of Rome which, with its population of 1 million inhabitants, made Thebes look like a little village.

Fast-forward to the turn of the 20th century, and the accolade of biggest city in the world rested firmly with London, the capital of the British Empire and the world’s economic and military centre. The population of the city at the time stood at nearly 6 million inhabitants, nearly six times the size of Ancient Rome. London was a city almost bursting at its seams: the population was getting too big for the city’s centuries-old infrastructure to maintain. Roads, food and water supply, solid waste disposal, sewers and other basic amenities were simply not coping.

But cities got even bigger; much, much bigger. Today, 120 years later, the largest city in the world is Tokyo, with a whopping 36 million residents – a population over six times that of turn-of-the-century London. Tokyo is one of a number of “megacities” that have sprung up all over the world. Other megacities include Delhi (26 million), Shanghai (24 million), São Paulo (21 million) and Mumbai (20 million). London does not even appear on the list of the top 10 largest cities in the world. According to a study by the Global Cities Institute, based in Canada, nearly 800 million people live in the world’s largest 101 cities – 11% of the world’s population. But this pales in comparison to what is coming in the rest of this century.
According to the study, by the end of this century 25% of the earth’s population, which will by then have hit the 13 billion mark, will reside in urban areas. This will give rise to cities much bigger than anything we’ve seen in human history: cities I like to call mega-megacities. The current urbanisation trend sweeping across Africa will continue unabated until, by 2100, the three biggest cities in the world will be in Africa: Lagos, Kinshasa and Dar es Salaam. It is estimated that the population of Lagos will stand at 88.3 million, Kinshasa at 83.5 million, and Dar es Salaam will be home to 73.7 million.

We are living in a time when the world’s population is experiencing accelerated growth, and the time to start planning for the future is now. We simply cannot afford a “wait-and-see” attitude.