The 103rd TechDay, dedicated to the Cognitive theme, took place on 25 September.

Prof. Amílcar Cardoso, professor at the Faculty of Science and Technology of the University of Coimbra, was a guest speaker at this event. He spoke about cognitive technologies and artificial intelligence (AI): what it is, its uses, and the different types of AI; and about machine learning (ML), the different types that exist, and the interplay between ML, Big Data, and performance. He also discussed some of the challenges this technological area brings, such as interaction, autonomy, and creativity, as well as problems related to the lack of regulation in the use of AI, misinformation, and the risks of societal polarization.

This was followed by Laura Simas, automation strategic adviser at DocDigitizer, who spoke about the growing recognition that data collected in many fields contain more information than was initially thought. This recognition raises the question of how to capitalize on these data efficiently through AI/ML, without human intervention, in a less error-prone process that frees people from performing tasks with no added value.

Pedro Miguel Neves and Guilherme Cardoso, both from the Technology and Innovation Strategy department, as well as José Nuno Sousa, from Altice Labs’ Operations Support Systems department, spoke about predicting problems in telecommunications networks and systems, and presented some use cases of cognitive operations in this type of environment: cognitive network operations centers, cognitive call centers, and cognitive maintenance of telecommunications infrastructures. These use cases allow network and system operations to move from a reactive to a proactive stance, anticipating problems and acting to reduce the impact on users.
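The reactive-to-proactive shift rests on spotting anomalous behaviour in network metrics before it becomes an outage. As a minimal sketch of the idea (not the speakers' actual method, and with entirely hypothetical data), a rolling-statistics rule can flag a latency sample that deviates sharply from recent history so an operator can intervene early:

```python
# Illustrative sketch only: flag link-latency samples that deviate strongly
# from a rolling window of recent history, as a toy proactive-alarm rule.
from statistics import mean, stdev

def flag_anomalies(samples, window=5, threshold=3.0):
    """Return indices of samples deviating more than `threshold` standard
    deviations from the rolling window that precedes them."""
    alerts = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(samples[i] - mu) > threshold * sigma:
            alerts.append(i)
    return alerts

# Hypothetical data: steady latency around 20 ms, sudden spike at index 8.
latencies = [20.1, 19.8, 20.3, 20.0, 19.9, 20.2, 20.1, 19.7, 45.0, 20.0]
print(flag_anomalies(latencies))  # → [8]
```

Real cognitive-operations systems use far richer ML models, but the principle is the same: act on the alert at index 8 before users feel the degradation.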

This was followed by a discussion panel with all the speakers, which offered an opportunity to discuss aspects such as the need to moderate expectations about this technology: it is not possible to have AI without information architecture (IA), since results depend strongly on the thorough cleaning of the ingested data and its correct correlation, both very complex processes. Every company needs to build the critical mass to understand the advantages this technology brings, something gained in small steps and through experimentation.
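To make the panel's point about data cleaning concrete, a hedged toy example (hypothetical field names, not from any of the talks): before records from different systems can be correlated, even trivial normalisation and deduplication must happen, or the AI layer correlates noise instead of information:

```python
# Illustrative sketch only: normalise and deduplicate raw (customer_id, email)
# pairs before they are correlated with records from other systems.
def clean_records(records):
    """Lower-case and trim emails, drop empty or malformed entries,
    and remove duplicate pairs."""
    seen, cleaned = set(), []
    for cust_id, email in records:
        email = email.strip().lower()
        if not email or "@" not in email:
            continue  # discard unusable entries rather than correlate noise
        key = (cust_id, email)
        if key not in seen:
            seen.add(key)
            cleaned.append(key)
    return cleaned

raw = [(1, " Ana@Example.com"), (1, "ana@example.com"), (2, ""), (3, "bad-address")]
print(clean_records(raw))  # → [(1, 'ana@example.com')]
```

Production pipelines add schema validation, entity resolution, and cross-source correlation on top of this, which is exactly why the panel described these processes as very complex.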