People always want to know what the leaders of technology companies have to say on all sorts of topics. MSNBC realized this and has a limited-run series called "Revolution" where they speak to different technology CEOs.
Tonight's guest was Apple CEO Tim Cook (the first episode featured Google CEO Sundar Pichai and YouTube CEO Susan Wojcicki). It was a serendipitously timed interview, coming on the heels of revelations that potentially ALL of Facebook's user profiles were inappropriately (if not unlawfully) made available to groups like the now-infamous Cambridge Analytica (check out my thoughts on Mark Zuckerberg's Senate testimony here).
Mr. Cook spoke about several subjects over the course of the one-hour special, focusing mostly on the role technology plays in the evolution of employment and education. But he also dipped his toes into data, privacy, policy, and politics.
Jobs
Naturally, a significant part of the conversation focused on the way technology is changing the employment landscape, both in the U.S. and around the world. Like many tech folks, Mr. Cook expressed the now-established observation that some jobs will be displaced and others will be created. Also like many in tech, myself included, he believes more will be created than displaced. The question, then, is how to make the workforce whose jobs were displaced by technology ready to take the jobs created by it.
It's a long-standing and vexing problem. As host Chris Hayes pointed out, there have been efforts to build retraining programs focused on technology jobs since the Clinton administration (over 20 years ago now), but none has had a truly transformative effect on the workforce of the moment (that is, those currently working, not those still being educated to enter the workforce). Mr. Cook believes - and I agree - that business needs to take a more active role in this.
(Re)training vs. Unicorns
A common theme in any conversation about technology and employment is that people need to become comfortable with lifelong learning. Where 40 or 50 years ago a person could finish high school and hold a steady job until retirement, today's technology-driven, fast-paced world is not so forgiving.
Mr. Cook believes that the technology industry should take a more active role in this. I agree with the sentiment. The reality, however, has been that technology companies are more interested in unicorns - those magical creatures with all of the skills necessary to stroll into a new position and create something that turns a profit within 6 months to a year - than in providing supplemental on-the-job training for those who may lack some skills or experience. Perhaps this is where the technology industry can stop paying lip service to the problem and start doing something about it. If companies were willing to invest in training and retraining, rather than endlessly seeking unfathomably expensive unicorns, they could help fill the jobs gap (currently estimated at nearly 500K tech jobs and expected to climb to nearly 2M), and possibly spend less money in the long term.
Say Goodbye to Certain Tasks
There is a part of what each of us does that will eventually be automated. As I talked about in my post on AI in Medicine, I think this is a good thing. Eliminating tedious tasks gives people more free time, and free time is the catalyst for all of the most important societal and technological revolutions in human history.
However, that doesn't mean we can be complacent with all of our newfound spare time. Eventually, other parts of our jobs will be automated too, and if we don't want to be automated away entirely, we have to be willing to learn new skills continuously. It's not just about refining abilities; it's about adding abilities to your resume.
This Whole "Made in America" Business
Everyone in the United States gets the warm fuzzies when they can buy a product that says "Made in America" on it. Of course, it's well known that Apple products are not assembled in the United States. Mr. Cook addressed this, pointing out that while final assembly doesn't occur in the U.S., many of the components are made here. From the glass to the various semiconductors, along with the overwhelming majority of the R&D effort, much of what goes into an Apple product is built in the United States; only the final assembly is completed overseas.
The Beauty Pageant Model
Mr. Cook was openly hostile to Amazon's "beauty pageant" approach to finding a new headquarters, and I tend to agree with him. It does very few people any good to have dozens or hundreds of cities put together time-consuming, expensive, elaborate pitches to host a new headquarters, factory, or research campus when only one can be selected (a building can't be in multiple states simultaneously, after all; unless, of course, you're in one of those state-straddling towns like Texarkana, in which case perhaps you could put half of your building on each side of the state line).
Data and Privacy
This part of the conversation was perhaps the most serendipitous for Tim Cook and Apple. It's not often that the news timing of a nationally televised interview (referring, of course, to the whole Facebook/Cambridge Analytica debacle) gives a company like Apple the opportunity to advertise the "righteousness" of its approach while reveling in the bad press of another technology giant.
It was mostly an advertisement for Apple's "walled garden" approach, but Mr. Cook did say something during this exchange that I thought was interesting: Deep profile graphs created by amalgamating profile data from multiple services (as Facebook and Google do) should not be allowed to exist.
Why? We know that these data graphs can provide extremely useful information, allowing for the creation of effective digital personal assistants and very effective prescriptive analytics. The ability of services like Alexa and Google Assistant to provide you with personalized information in a wide array of contexts comes from these deep, composed profile graphs. The fact that Apple is not a fan of this approach may also explain why Siri is not nearly as useful in this context as Alexa and Assistant.
But Apple believes there is a very real downside: the ability to make inferences about a person on subjects not explicitly within the deep graph profile. Artificial Intelligence - and Deep Learning specifically - has made the process of finding the slightest trends in oceans of data relatively easy, at times to a frightening degree. AI has given those with access to these deep graph profiles not only the ability to help you plan your day, make sure you grab your umbrella on the way out of the house, and remind you to pick up a gallon of milk because it's been two weeks since you last went shopping (and you're probably running low); it can also potentially reveal your inherent biases and political leanings.
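To make that worry concrete, here is a minimal sketch of the amalgamation-plus-inference pattern Mr. Cook is objecting to. To be clear, this is not how Facebook, Google, or Cambridge Analytica actually built their systems; the services, feature names, and data below are entirely hypothetical, and a toy logistic regression stands in for the far more capable deep learning models the real players would use.

```python
# Hypothetical sketch: merge profile fragments from two made-up services
# into one composite profile, then infer a trait no service ever recorded.
from sklearn.linear_model import LogisticRegression

# Fragments as separate services might hold them (all data invented).
social_profiles = {
    "user_1": {"likes_outdoor_pages": 1, "posts_per_week": 12},
    "user_2": {"likes_outdoor_pages": 0, "posts_per_week": 2},
    "user_3": {"likes_outdoor_pages": 1, "posts_per_week": 7},
    "user_4": {"likes_outdoor_pages": 0, "posts_per_week": 1},
}
shopping_profiles = {
    "user_1": {"organic_grocery_ratio": 0.8},
    "user_2": {"organic_grocery_ratio": 0.1},
    "user_3": {"organic_grocery_ratio": 0.6},
    "user_4": {"organic_grocery_ratio": 0.2},
}

def compose(user_id):
    """Amalgamate one user's fragments into a single feature vector."""
    merged = {**social_profiles[user_id], **shopping_profiles[user_id]}
    return [
        merged["likes_outdoor_pages"],
        merged["posts_per_week"],
        merged["organic_grocery_ratio"],
    ]

# Labels for a trait never stated in any profile (say, from a small survey).
train_ids = ["user_1", "user_2", "user_3"]
labels = [1, 0, 1]  # 1 = leans one way politically, 0 = the other
X = [compose(uid) for uid in train_ids]

model = LogisticRegression().fit(X, labels)

# Infer the unstated trait for a user who never disclosed it anywhere.
print(model.predict([compose("user_4")]))
```

Neither fragment says anything about politics on its own; it's the composed profile plus a statistical model that produces the guess. That same composition is what makes digital assistants so helpful, and it's also exactly the inference capability at issue here.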
And it was this darker capability - inferring what people never explicitly expressed - that Cambridge Analytica used to influence people's thoughts and opinions in the political arena. Mr. Cook believes that this was not only immoral, but a violation of people's sovereign, individual rights. I don't disagree. What I do disagree with is the prescription: to never allow deep graph profiles to exist. This, in my opinion, is throwing out the baby with the bath water. As with splitting the atom, there are enormous positive benefits and many dark possibilities. It's up to us to have some form of ethical check on our technology, but that doesn't mean the technology should not be allowed to exist.
Technology's Role in Education
Apple is no stranger to the classroom. When I was in kindergarten (a "few" years ago...), our school had a handful of Apple II computers for students to use for educational games and word processing. Apple was the first to really put the PC into the classroom, and they've been there ever since.
So it was no surprise that Mr. Cook wanted to emphasize Apple's most recent efforts to augment education with technology and software. The current emphasis is on adding coding to the school curriculum and providing augmented reality (AR) tools to enhance the non-coding parts of it.
Should it be required, as Mr. Cook suggests, that everyone learn to code at some level of proficiency, just as everyone has to read Shakespeare and learn algebra? His argument for such a change to the curriculum is that coding requires problem solving and critical thinking, and that those skills are necessary regardless of eventual profession. I agree that critical thinking and problem solving are the important parts of coding, just as they are the important parts of mathematics, the sciences, meaningful analysis of literature and history, and everything else. I'm all for teaching people to code, but I am concerned that coding would displace another subject (there is only so much time in the school day). Why are any of these other subjects less important? And can critical thinking and problem solving not be taught in those other subject areas? I would hate for schools to have to choose between coding and a subject that likely doesn't get enough attention already, like economics or government/civics.
In the end, it was a pretty interesting high-level conversation about technology and the tech industry's responsibilities with respect to users and society. I'm glad I took the time to watch it. If you didn't happen to catch it on MSNBC when it aired, you can listen to the entire interview (don't worry, you won't miss anything by not having video) below or on the MSNBC Website.
As always, feel free to leave a comment below, subscribe to my blog on the right, or follow me on Twitter and LinkedIn.