Cognitive computing
Cognitive computing refers to technology platforms that, broadly speaking, are based on the scientific disciplines of artificial intelligence and signal processing. These platforms encompass machine learning, reasoning, natural language processing, speech recognition and vision (object recognition), human–computer interaction, dialog and narrative generation, among other technologies.[1][2]
Definition
At present, there is no widely agreed-upon definition of cognitive computing in either academia or industry.[1][3][4]
In general, the term cognitive computing has been used to refer to new hardware and/or software that mimics the functioning of the human brain.[5][6][7][8][9] In this sense, cognitive computing is a new type of computing with the goal of developing more accurate models of how the human brain/mind senses, reasons, and responds to stimuli. Cognitive computing applications link data analysis and adaptive page displays (AUI) to adjust content for a particular type of audience. As such, cognitive computing hardware and applications strive to be more affective and more influential by design.
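As a rough illustration of the adaptive-display idea mentioned above, the sketch below selects a content variant based on a simple model of the visitor. The audience segments, signals, and content variants are hypothetical and are not drawn from any cited system; this is a minimal sketch, not an implementation of any particular platform.

```python
# Illustrative sketch only: the audience segments and content variants below
# are invented for the example, not taken from any cited cognitive computing product.
from dataclasses import dataclass

@dataclass
class Visitor:
    reading_level: str    # e.g. "novice" or "expert" (hypothetical segment)
    prefers_visuals: bool

CONTENT_VARIANTS = {
    ("novice", True): "short summary with diagrams",
    ("novice", False): "short plain-text summary",
    ("expert", True): "full technical article with charts",
    ("expert", False): "full technical article",
}

def adapt_page(visitor: Visitor) -> str:
    """Pick a content variant for this audience segment (adaptive display)."""
    return CONTENT_VARIANTS[(visitor.reading_level, visitor.prefers_visuals)]

if __name__ == "__main__":
    print(adapt_page(Visitor(reading_level="novice", prefers_visuals=True)))
```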
The term "cognitive system" also applies to any artificial construct able to perform a cognitive process where a cognitive process is the transformation of data, information, knowledge, or wisdom to a new level in the DIKW Pyramid.[10] While many cognitive systems employ techniques having their origination in artificial intelligence research, cognitive systems, themselves, may not be artificially intelligent. For example, a neural network trained to recognize cancer on an MRI scan may achieve a higher success rate than a human doctor. This system is certainly a cognitive system but is not artificially intelligent.
Cognitive systems may be engineered to feed on dynamic data in real-time, or near real-time,[11] and may draw on multiple sources of information, including both structured and unstructured digital information, as well as sensory inputs (visual, gestural, auditory, or sensor-provided).[12]
Cognitive analytics
Cognitive computing-branded technology platforms typically specialize in the processing and analysis of large, unstructured datasets.[13]
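As a loose illustration of what analysis of unstructured data can mean in practice, the short sketch below extracts simple structure (term frequencies) from free-text documents. The documents and the tokenization rule are invented for the example and are not drawn from any cited platform.

```python
# Toy sketch: pulling simple structure (term counts) out of unstructured text.
# The documents below are invented examples, not from any cited platform.
from collections import Counter
import re

documents = [
    "Patient reports mild chest pain after exercise.",
    "No chest pain reported; patient sleeping well.",
    "Severe pain in left arm and chest, shortness of breath.",
]

def term_frequencies(text: str) -> Counter:
    """Lowercase the text, tokenize on alphabetic runs, and count terms."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

# Aggregate counts across the whole (unstructured) collection.
corpus_counts = Counter()
for doc in documents:
    corpus_counts.update(term_frequencies(doc))

print(corpus_counts.most_common(5))
```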
Applications
- Education
- Although cognitive computing cannot take the place of teachers, it can still be a significant driving force in the education of students. In the classroom, cognitive computing is applied essentially as an assistant personalized to each individual student. Such a cognitive assistant can relieve some of the pressure teachers face while also enhancing the student's overall learning experience.[14] Teachers may not be able to give every student individual attention, and this is the gap that cognitive computers can fill. Some students need more help with a particular subject, and for many, interaction with a teacher can cause anxiety and discomfort. With the help of cognitive computer tutors, students do not have to face that uneasiness and can gain the confidence to learn and do well in the classroom.[15] While a student is in class with their personalized assistant, the assistant can develop various techniques, such as creating lesson plans, tailored to the student and their needs.
- Healthcare
- Numerous technology companies are developing cognitive computing systems for use in the medical field. The ability to classify and identify is one of the main goals of these cognitive devices.[16] This trait can be very helpful in identifying carcinogens. A cognitive system capable of such detection can assist an examiner in interpreting large numbers of documents in less time than would otherwise be required. The technology can also evaluate information about the patient, reviewing medical records in depth and searching for indications of the source of their problems.
- Commerce
- Together with artificial intelligence, cognitive computing has been used in warehouse management systems to collect, store, organize and analyze supplier data. These efforts aim at improving efficiency, enabling faster decision-making, monitoring inventory and detecting fraud[17] (a simple fraud-detection sketch appears after this list).
- Human Cognitive Augmentation
- In situations where humans use or work collaboratively with cognitive systems, called a human/cog ensemble, the results achieved by the ensemble are superior to those obtainable by the human working alone; the human is therefore cognitively augmented.[18][19][20] When the human/cog ensemble achieves results at or above the level of a human expert, the ensemble is said to have achieved synthetic expertise.[21] In a human/cog ensemble, the "cog" is a cognitive system employing virtually any kind of cognitive computing technology.
- Other use cases
- Speech recognition
- Sentiment analysis
- Face detection
- Risk assessment
- Fraud detection
- Behavioral recommendations
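As referenced in the Commerce item above, the following is a minimal, self-contained sketch of one of these use cases: flagging potentially fraudulent supplier transactions as statistical outliers. The invoice data, field names, and z-score threshold are invented for illustration and do not reflect any particular system.

```python
# Toy fraud-detection sketch: flag supplier invoices whose amounts are
# statistical outliers (z-score above a threshold). All values are invented.
from statistics import mean, stdev

invoices = [
    {"supplier": "A", "amount": 1020.0},
    {"supplier": "B", "amount": 980.0},
    {"supplier": "C", "amount": 1015.0},
    {"supplier": "D", "amount": 9950.0},   # unusually large: should be flagged
    {"supplier": "E", "amount": 1005.0},
]

amounts = [inv["amount"] for inv in invoices]
mu, sigma = mean(amounts), stdev(amounts)

def is_suspicious(amount: float, threshold: float = 1.5) -> bool:
    """Flag amounts more than `threshold` standard deviations from the mean."""
    return abs(amount - mu) / sigma > threshold

flagged = [inv for inv in invoices if is_suspicious(inv["amount"])]
print(flagged)   # expected: only the 9950.0 invoice from supplier D
```

A production system would, of course, use far richer features and models than a single z-score, but the sketch shows the basic shape of the task: learning what "normal" looks like in the data and surfacing deviations for human review.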
Industry work
Cognitive computing, in conjunction with big data and algorithms that comprehend customer needs, can be a major advantage in economic decision-making.
Cognitive computing and artificial intelligence hold the potential to affect almost every task that humans are capable of performing. This could negatively affect human employment, as the need for human labor would decrease. It could also widen wealth inequality: the people at the head of the cognitive computing industry would grow significantly richer, while workers without ongoing, reliable employment would become less well off.[22]
The more industries start to use cognitive computing, the more difficult it will be for humans to compete.[22] Increased use of the technology will also increase the amount of work that AI-driven robots and machines can perform. Only extraordinarily talented, capable and motivated humans would be able to keep up with the machines. The influence of competitive individuals, in conjunction with artificial intelligence and cognitive computing, has the potential to change the course of humankind.[23]
See also
- Automation
- Affective computing
- Analytics
- Artificial intelligence
- Artificial neural network
- Brain computer interface
- Cognitive computer
- Cognitive reasoning
- Cognitive science
- Enterprise cognitive system
- Semantic Web
- Social neuroscience
- Synthetic intelligence
- Usability
- Neuromorphic engineering
- AI accelerator
References
- ^ a b Kelly III, Dr. John (2015). "Computing, cognition and the future of knowing" (PDF). IBM Research: Cognitive Computing. IBM Corporation. Retrieved February 9, 2016.
- ^ "Augmented intelligence, helping humans make smarter decisions". Hewlett Packard Enterprise. http://h20195.www2.hpe.com/V2/GetPDF.aspx/4AA6-4478ENW.pdf Archived April 27, 2016, at the Wayback Machine.
- ^ "Cognitive Computing". April 27, 2014. Archived from the original on July 11, 2019. Retrieved April 18, 2016.
- ^ Gutierrez-Garcia, J. Octavio; López-Neri, Emmanuel (November 30, 2015). "Cognitive Computing: A Brief Survey and Open Research Challenges". 2015 3rd International Conference on Applied Computing and Information Technology/2nd International Conference on Computational Science and Intelligence. pp. 328–333. doi:10.1109/ACIT-CSI.2015.64. ISBN 978-1-4673-9642-4. S2CID 15229045.
- ^ Terdiman, Daniel (2014). "IBM's TrueNorth processor mimics the human brain". CNET. http://www.cnet.com/news/ibms-truenorth-processor-mimics-the-human-brain/
- ^ Knight, Shawn (2011). "IBM unveils cognitive computing chips that mimic human brain". TechSpot. August 18, 2011.
- ^ Hamill, Jasper (2013). "Cognitive computing: IBM unveils software for its brain-like SyNAPSE chips". The Register. August 8, 2013.
- ^ Denning, P. J. (2014). "Surfing Toward the Future". Communications of the ACM. 57 (3): 26–29. doi:10.1145/2566967. S2CID 20681733.
- ^ Ludwig, Lars (2013). Extended Artificial Memory. Toward an integral cognitive theory of memory and technology (PDF) (Thesis). Technical University of Kaiserslautern. Retrieved February 7, 2017.
- ^ Fulbright, Ron (2020). Democratization of Expertise: How Cognitive Systems Will Revolutionize Your Life (1st ed.). Boca Raton, FL: CRC Press. ISBN 978-0367859459.
- ^ Ferrucci, David; Brown, Eric; Chu-Carroll, Jennifer; Fan, James; Gondek, David; Kalyanpur, Aditya A.; Lally, Adam; Murdock, J. William; Nyberg, Eric; Prager, John; Schlaefer, Nico; Welty, Chris (July 28, 2010). "Building Watson: An Overview of the DeepQA Project" (PDF). AI Magazine. 31 (3): 59–79. doi:10.1609/aimag.v31i3.2303. S2CID 1831060. Archived from the original (PDF) on February 28, 2020.
- ^ Deanfelis, Stephen (2014). "Will 2014 Be the Year You Fall in Love With Cognitive Computing?". Wired. April 21, 2014.
- ^ "Cognitive analytics - The three-minute guide" (PDF). 2014. Retrieved August 18, 2017.
- ^ Sears, Alec (April 14, 2018). "The Role Of Artificial Intelligence In The Classroom". ElearningIndustry. Retrieved April 11, 2019.
- ^ Coccoli, Mauro; Maresca, Paolo; Stanganelli, Lidia (May 21, 2016). "Cognitive computing in education". Journal of e-Learning and Knowledge Society. 12 (2).
- ^ Dobrescu, Edith Mihaela; Dobrescu, Emilian M. (2018). "Artificial Intelligence (Ai) - The Technology That Shapes The World" (PDF). Global Economic Observer. 6 (2): 71–81. ProQuest 2176184267.
- ^ "Smart Procurement Technologies for the Construction Sector". publication.sipmm.edu.sg. October 25, 2021. Retrieved March 2, 2022.
- ^ Fulbright, Ron (2020). Democratization of Expertise: How Cognitive Systems Will Revolutionize Your Life. Boca Raton, FL: CRC Press. ISBN 978-0367859459.
- ^ Fulbright, Ron (2019). "Calculating Cognitive Augmentation – A Case Study". Augmented Cognition. Lecture Notes in Computer Science. Vol. 11580. pp. 533–545. arXiv:2211.06479. doi:10.1007/978-3-030-22419-6_38. ISBN 978-3-030-22418-9. S2CID 195891648.
- ^ Fulbright, Ron (2018). "On Measuring Cognition and Cognitive Augmentation". Human Interface and the Management of Information. Information in Applications and Services. Lecture Notes in Computer Science. Vol. 10905. pp. 494–507. arXiv:2211.06477. doi:10.1007/978-3-319-92046-7_41. ISBN 978-3-319-92045-0. S2CID 51603737.
- ^ Fulbright, Ron (2020). "Synthetic Expertise". Augmented Cognition. Human Cognition and Behavior. Lecture Notes in Computer Science. Vol. 12197. pp. 27–48. arXiv:2212.03244. doi:10.1007/978-3-030-50439-7_3. ISBN 978-3-030-50438-0. S2CID 220519330.
- ^ a b Makridakis, Spyros (June 2017). "The forthcoming Artificial Intelligence (AI) revolution: Its impact on society and firms". Futures. 90: 46–60. doi:10.1016/j.futures.2017.03.006. S2CID 152199271.
- ^ West, Darrell M. (2018). The Future of Work: Robots, AI, and Automation. Brookings Institution Press. ISBN 978-0-8157-3293-8. JSTOR 10.7864/j.ctt1vjqp2g.[page needed]
Further reading
- Russell, John (February 15, 2016). "Mapping Out a New Role for Cognitive Computing in Science". HPCwire. Retrieved April 21, 2016.