Science is an ivory tower
Communicating science for the masses
by Deborah Berebichez
Dr. Berebichez is a physicist, TV host, and STEM advocate. She co-hosts Discovery’s Outrageous Acts of Science, where she uses her physics background to explain the science behind extraordinary engineering feats. We sat down with her to talk about the future of science education.
Has social media made it easier for the scientific community to engage with a wider audience, making complex STEM topics more accessible?
Yes! Social media has given many scientists the opportunity to become science communicators, which is great. We now have thousands of people telling us what their research is about and how it impacts our daily lives. But social media has created its own challenges. The ubiquity of scientific information has also left the public confused. The vast amount of information on social media can appear contradictory or, at best, hard to interpret when it comes to judging the importance of certain results. One study in France claims drinking coffee is good for you, and a month later a different study in Italy claims the opposite. What the public isn’t told is that the devil is in the details. Perhaps the French study was done only with middle-aged women on a certain diet, while the Italian one used older men with a very different diet, and that’s why the results differed. And some communicators, in their eagerness to tell a good story, oversimplify or even omit these details. So there is a gap: we need more people translating the full complexity of science into lay terms. It’s the combination of studies, or “meta-studies,” that provides a complete picture, and we need more people who can present them in a clear and concise way. We already have some extremely talented science communicators doing exactly that, and we should absolutely support them!
What is the recipe for engaging an audience while preserving the complexity of the science?
The formula is to have someone who is genuinely passionate about explaining science, especially if we want to grab the attention of a public that is already burdened with a lot of distracting information. At the same time, the communicator needs to deeply understand both the big picture and the technical details of the research. The perfect balance between complexity and engagement can be achieved, but it’s not easy. As the famous physicist Richard Feynman once said, “If you cannot explain your research to your own mother, it means YOU don’t understand it.”
Do you think our fear of the future and of AI stems from the fact that people just don’t really get it, or that it hasn’t been explained well by the STEM community?
I think we are going through an uncertain time, economically and politically, worldwide, and people are afraid because they are seeing a lot of change and the future seems uncertain. I think most people, when they think of AI, are really thinking of what we call Artificial General Intelligence (AGI), the type of AI that can carry out any cognitive function the way a human can. That technology is not here yet. Far from it. Today, what we have is called Narrow AI, in applications such as IBM Watson, Siri, Alexa, and others. The main difference between AGI and Narrow AI is autonomy and objective-setting. That is, Siri only works if you ask it a question; it can’t decide on its own to tell you about the weather. This doesn’t mean that AGI won’t arrive in the future. We have to accept that the world is changing without giving in to fear. Of course, it’s natural to be afraid when we can no longer rely on the systems and rules that we grew up with, especially for the older generation. But don’t let fear lead you into the future; let curiosity lead you into it!
How is technology changing the way we need to educate our young people?
There is a certain computing literacy that is now required of young people. Just as you need to learn a language in school, you also need to know a little bit about computer science. I help run a company called Metis that teaches people AI, machine learning, and data science, and when we talk to large Fortune 500 companies, we notice that a few of them are, at first, searching for direction when it comes to AI; they ask what it is for and why they need it. But companies quickly discover that automating certain tasks doesn’t mean they can get rid of 10 workers and replace them with a program. What it means is that those 10 workers can now free up a few hours a day for tasks that require more complex thinking, such as interpreting interesting data, looking at outliers, and making data-based business decisions. For the most part, AI is not taking away jobs. On the contrary, it is opening up more opportunities, because there is a massive need for a deeper understanding of what the algorithms are doing and what the results mean.
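As a small illustration of the kind of “looking at outliers” task she describes, here is a minimal sketch in Python. The daily sales figures, the function name, and the 2.5-sigma threshold are all hypothetical choices made for this example, not anything specific to Metis or its courses.

# A simple z-score rule: flag values that sit far from the mean.
from statistics import mean, stdev

def find_outliers(values, threshold=2.5):
    # Hypothetical helper, for illustration only: return the values that
    # lie more than `threshold` standard deviations from the mean.
    mu = mean(values)
    sigma = stdev(values)
    if sigma == 0:
        return []
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Hypothetical daily sales figures; the one-day spike is the kind of
# outlier a human analyst would want to investigate and interpret.
daily_sales = [102, 98, 105, 99, 101, 97, 480, 103, 100, 96]
print(find_outliers(daily_sales))  # prints [480]

The point of the sketch is the part the program cannot do: deciding whether that spike is a data error, a promotion, or a real shift in demand is exactly the interpretive work that stays with people.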
What should schools do to make sure we are giving people the right skills for the future?
My particular concern is that many schools teach a very basic coding curriculum just to satisfy a requirement. A lot of these curricula make kids memorize coding rules without thinking about the problem at hand or the goal of the code. In this way, they forget that coding is a means to an end, a way to solve problems. Therefore, the most important skill we should learn is critical thinking: how to think independently about all kinds of problems, not how to memorize rules. A great school helps its students gain skills that last a lifetime: a love of learning, endless curiosity, and critical thinking.