The cognitive function and the framework of the functional hierarchy

Cognitive computing is part of AI, and cognitive applications consist of cognitive services, which are the building blocks of cognitive systems. These applications mimic human brain functions, for example, recognizing a speaker or sensing the tone of a text. In this paper, we present the similarities of these functions with human cognitive functions. We establish a framework which gathers cognitive functions into nine intentional processes derived from the substructures of the human brain. The framework underpins human cognitive functions and categorizes cognitive computing functions into a functional hierarchy, through which we present the functional similarities between cognitive services and human cognitive functions to illustrate what kinds of functions are cognitive in computing. The results from the comparison of the functional hierarchy of cognitive functions are consistent with the cognitive computing literature. Thus, the functional hierarchy allows us to identify the type of cognition and achieve comparability between applications.


Introduction
In the computing domain, the word 'cognitive' has become commonplace. Since cognitive computing approaches cognition and imitates human cognitive processing, it is essential to explicate the correspondences between human cognitive functions and the functions in cognitive computing. Furthermore, once this close similarity is clarified, it helps us identify, use and classify their properties and abilities. Therefore, we construct a framework of the functional hierarchy. We searched for related research in Google Scholar and Scopus. The Google Scholar search returned 2 hits for the search terms "cognitive service" AND "cognitive function" AND "mapping", where the topics of the articles are as follows: (1) Cognition and the web; (2) Pharm.Care@BLED Build - Lead - Engage - Disseminate. The Scopus search returned 0 hits for the same search terms. Therefore, we present and map the functions of the cognitive services (i.e., IBM's Visual Recognition service, Microsoft's Speaker Recognition and IBM's Tone Analyzer) onto the human cognitive functions to illustrate the types of functionality that can be considered cognitive. The research steps and results are presented in the order in which the study proceeds (Figure 1). Cognitive computing functions imitate the cognitive functions of the human mind. Therefore, we present examples of the cognitive functions of the human brain (Section 3). Further (Section 4), we construct a framework, grounded in the human cognitive functions, to categorize the cognitive computing functions. Thereafter, we use these categories to map the cognitive functions (Section 5). In Section 6, we present the result of comparing the similarities and differences between the cognitive computing functions and applications that are not defined as cognitive (NDC).

Material and methods
In this paper, we focus on the functions of neural systems that are associated with brain structures. We used the cognitive functions of the 3D Brain mobile application, which illustrates and explains the brain structures with their associated functions [1]. The 3D Brain contains descriptions of the whole brain with 28 substructures and their associated functions.

Figure 1. Research steps. (ACI 16,1/2)

First, we tabulated the 3D Brain structures and the associated functions of each structure (see Appendix A, Table 2). We then drafted a graph view of this information.
We used the '3D Brain' information to construct a graph view of the human brain. First, we illustrated each lobe and its structures in the graph. Second, we added the associated functions to each structure and attached links between them (i.e., links between structure and function). Third, we constructed the links between structures by following the description information. During the construction of the graph, we found that the functions form chains of processes for performing a task, e.g., visual perception. Further, some processes, such as language processes, needed further description. Therefore, we supplemented the presentation of the brain structures and associated functions (see the descriptions of the main processes). We then grouped the brain functions according to process, which forms nine main processes. These groups are obtained from the intentional functions of the brain and from the paths through which sensory stimuli transition into the cognitive processes and further to the outcome. If an associated cognitive function of a brain structure occurred in more than one process, then we listed it in each participating process. These cognitive groups of functions form a functional interactive hierarchy, wherein the associated cognitive functions are grouped adjacent to their respective sensory stimuli (i.e., visual functions, auditory functions, motor functions, sensation functions, homeostasis functions); then the language and the emotion- and behavior-related multi-process functions are presented; memory-related functions are collected into memory functions; and finally the higher, executive and complex cognitive functions are placed in cognitive functions. When we tried to compare human cognitive functions with cognitive service functions, we found that the extent of the functionality is not the same; therefore, we present mappings of the similarities between the cognitive functions of the applications (i.e., cognitive services and NDC applications) and the human cognitive functions. Finally, we used the functional hierarchy to compare the similarities and differences between the cognitive functions in cognitive services and the NDC applications.
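The grouping step described above can be sketched in code. The structure/function pairs and the process assignments below are a small illustrative subset invented for this sketch, not the full 3D Brain table; the point is the rule that a function occurring in more than one process is listed under each participating process.

```python
# Sketch of the tabulation and grouping step, with illustrative data only.
from collections import defaultdict

# Brain structure -> associated functions (illustrative subset)
structure_functions = {
    "Superior Temporal Gyrus": ["auditory information processing", "auditory memory"],
    "Broca's area": ["speech production", "language comprehension"],
    "Hippocampus": ["early memory storage", "spatial navigation"],
    "Primary motor cortex": ["initiation of motor movements"],
}

# Function -> one or more of the nine main processes; a function involved
# in several processes is deliberately listed under each of them
function_processes = {
    "auditory information processing": ["auditory processes", "language processes"],
    "auditory memory": ["memory processes"],
    "speech production": ["language processes", "motor processes"],
    "language comprehension": ["language processes"],
    "early memory storage": ["memory processes"],
    "spatial navigation": ["cognitive processes"],
    "initiation of motor movements": ["motor processes"],
}

# Group functions by process, as in the construction of the hierarchy
grouped = defaultdict(set)
for functions in structure_functions.values():
    for f in functions:
        for p in function_processes[f]:
            grouped[p].add(f)

for process, funcs in sorted(grouped.items()):
    print(process, "->", sorted(funcs))
```

Running the sketch shows, for example, that "speech production" appears under both the language and motor processes, mirroring how shared functions are duplicated across groups in the tabulation.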
About the human brain structures and how they underpin the cognitive functions
The brain graph with connected structures and functions visualizes the parts of the brain. From this sketched graph, we grouped the brain functions according to process into nine main processes: language processes, auditory processes, visual processes, sensation processes, homeostasis processes, motor processes, emotion and behavior processes, memory processes, and cognitive processes. The main groups are formed from the intentional functions of the brain's substructures. For example, the language processes functions contain all language-related functions, such as auditory, speech and written language production and comprehension. These categories are obtained from the intentional functions of the brain and from the paths through which sensory stimuli transition into the cognitive processes and further to the outcome. The next eight paragraphs give a short description of each of them (auditory processes are described in connection with language processes): 1. Language processes functions are attached to many brain structures. The first cerebral cortex to receive auditory input is the primary auditory cortex, which is a substructure of the Superior Temporal Gyrus, located in the Temporal lobe. Brodmann areas 42 and 22 interpret the auditory stimuli and auditory associations [2]. The Superior Temporal Gyrus has functions for auditory information processing, and it uses a Tonotopic map (i.e., a map of sounds and frequencies) and auditory memory [1]. Also, Broca's area attends to language processing, speech production and comprehension; it is connected with Wernicke's area of the temporal lobe via nerve fiber bundles (i.e., the Arcuate Fasciculus) [3].

Wernicke's area processes written and spoken language [4] and is involved in language comprehension [1]. Besides sound processing, the Temporal lobe performs speech processing and includes the Middle Temporal Gyrus, which performs word retrieval and language processing [1]. The Parietal lobes are involved in placing the lips and tongue in the proper position for speaking [5], and the primary motor cortex initiates the lip and tongue movements [1].
2. Visual processes: the occipital lobes "receive and interpret visual information" and are involved in functions such as reading and reading comprehension [6].
3. Sensation processes: the Parietal lobes integrate information about the different senses; their main functions are to receive and process sensory information. For example, the Somatosensory cortex in the parietal lobe participates in identifying the location of a touch sensation [5]. The Parietal lobes have an important role in Somatosensory information perception and integration [1], as each "contains many distinct reference maps of body, near and distant space, and which are constantly updated" [1]. The limbic system is involved in sensory perception and sensory interpretation [7]: the amygdala gets the sensory signals from the thalamus and uses them to process the emotions; further, the hippocampus connects emotions and senses to memories [7] and participates in early memory storage and the formation of long-term memory [1].
4. Homeostasis processes: the autonomic functions are controlled by brain structures such as the Hypothalamus, Pons and Brainstem. The Hypothalamus "is responsive to a variety of stimuli including light (it regulates circadian rhythms), odors (e.g. pheromones), stress, and arousal (hypothalamic neurons release oxytocin directly into the bloodstream)" [1]. The Hypothalamus controls functions such as hunger, thirst, body temperature, perspiration, blood pressure, shivering, pupil dilation, circadian rhythms, sleep and heart rate. The brainstem also participates in controlling blood pressure, sleep and heart rate. The brainstem contains the following brain structures: Pons, Medulla Oblongata, and midbrain. It also controls functions such as perspiration, digestion, temperature and the regulation of breathing. Breathing and the circuits that generate respiratory rhythms are mostly associated with the Pons [1].
5. Emotion and behavior processes: the recent research of Kitamura et al. [8] explains that the Engram cells of the Basolateral amygdala store both positive and negative emotional events and are important for the communication of emotions linked to a memory. Further, Bergland [9] explains that "the amygdala acts as a type of emotional relay between the hippocampus and prefrontal cortex". The Hypothalamus participates in regulating emotional responses "elicited by sensory information through the release of hormones that act on the pituitary gland in response to stress" [7]. In the recent research of Manninen et al. [10], "social laughter increased pleasurable sensations and triggered endogenous opioid release in Thalamus, Caudate nucleus, and Anterior Insula". Further, the relationship between opioid receptor density and the rate of laughter may indicate that the opioid system underlies our differences in sociability. Our emotions are recognizable in our comportment; they are manifest in the behavioral patterns of facial expressions and in autonomic arousal [11]. Furthermore, Dolan [11] explains the global effects on all aspects of cognition, such as emotional influence on perception and attention, and subjective feeling states. Emotional experience in the brain involves the Insular cortex, the Orbitofrontal cortex, and the anterior and Posterior Cingulate Cortices [11]. Baley [12] explains that the prefrontal cortex in the frontal lobes "is responsible for personality expression" and that the frontal lobes contribute to the functions which form our individual personalities. The prefrontal cortex is associated with voluntary behavior, e.g., decision-making, planning, problem-solving and thinking, and with personality and emotion through evaluating and controlling appropriate social behavior and inhibition.
6. Motor processes: the primary motor cortex is associated with the coordination and initiation of motor movements [1].
The primary motor cortex area is divided into specific body parts; each body part's cell density and surface area differ, for example, the "arm hand motor area occupies the most space in the motor cortex (unsurprising given their importance to human behavior)" [1]. The primary motor cortex is a substructure of the motor cortex; in addition, the motor cortex contains the Premotor cortex and the supplementary motor area [13]. The Premotor cortex participates in the preparation and execution of limb movements. To choose the right movement, it uses information from the other cortical regions. It also participates in learning (in the form of imitation) and social cognition functions (in the form of empathy) [1]. The supplementary motor area participates in "selecting movements based on the remembered sequences of movements", the mental rehearsal of bilateral movements, and the transformation of kinematic information into dynamic information [13]. The thalamus participates in relaying motor and sensory information between the cortex, the cortical structures and the brain stem. As we also express ourselves in body language, emotions, for example, appear in our countenance and gestures, both involuntarily and voluntarily. "Voluntary movements require the participation of the motor cortex and association cortex"; although the association cortex does not belong to the motor areas, it is still necessary for adapting the movement to be appropriate to the behavioral context [13].
7. Memory processes: according to Poo et al. [14], "synapses are the basic units of information storage". The amygdala is a substructure of the limbic system, and it determines "what memories are stored and where the memories are stored in the brain" [7]. Dolan [11] describes that the amygdala participates in episodic memory encoding and in the retrieval of emotional context and items, and it has a critical role in fear conditioning, which is a form of emotional memory (i.e., implicit memory). Further, the amygdala attends to the learning of conditioned and unconditioned associations [11], for example associative learning such as reward and appetitive learning. The hippocampus is the brain structure most closely joined to memory formation. It is an early storage place for long-term memory, and it is involved "in the transition of long-term memory to enduring memory" [1]. It also has an important role in spatial navigation [1]. The latest research result of Kitamura et al. [8] also indicates that the hippocampus is an early storage place for long-term memory and that memories are initially established at the same time in the hippocampus and in the Engram cells of the prefrontal cortex (i.e., specialized neurons, which consolidate long-term memories as time passes). According to the research, the prefrontal cortex, hippocampus and Basolateral amygdala participate simultaneously in early memory formation until the memory is consolidated in the prefrontal cortex. Bergland [9] further explains that the technological breakthrough of the Tonegawa lab "allowed them to label specific Engram cells in various parts of the brain that contained specific memories", which further allowed the scientists to trace the brain circuits participating in memory formation, storage and retrieval. The Subiculum is associated with functions such as memory processing; it is in the output area of the hippocampus (in the temporal lobes) and is important to learning and memory.
8. Cognitive processes and cognition: "all cognitive functions result from the integration of many simple processing mechanisms, distributed throughout the brain" [1]. The following brain structures participate in cognition. Baley [12] explains that the prefrontal cortex in the frontal lobes "is responsible for personality expression and the planning of complex cognitive behaviors" such as reasoning and problem solving. Further, she explains that the frontal lobes contribute to functions such as judgement and "help us set and maintain goals, curb negative impulses and form our individual personalities" [12]. The cerebellum contains about 80 percent of the total neurons of the human brain, most of them granule cells [16]; still, until recently it was considered to occupy only unconscious activities. Bergland [16] presents novel neuroscience studies of the cerebellum [17][18][19][20][21], which indicate that the cerebellum and other subcortical brain structures, such as the basal ganglia, participate in various cognitive processes. The cerebellum in the '3D Brain' [1] is associated with cognitive functions such as sequence learning, motor learning and attention. "Pathways from occipital lobes to temporal and parietal lobes are eventually processed consciously" [1]. The parietal lobe [5] processes attentional awareness of the environment [1]. Its substructures' functions are associated with the perception and integration of Somatosensory information, and it participates in manipulating objects and in number representation [1]. The Middle and Inferior Temporal Gyri participate in many cognitive processes, such as semantic memory processing, language processes and visual perception [1]. The Perirhinal cortex has an "important role in object recognition" and has many connections to other brain structures. These connections "allow it to specialize in associating objects with sensory information and potential consequences" [1].
The substructures of the temporal lobes participate in perception, face and object recognition, emotional reactions, language understanding, learning and memory functions, for example the Superior Temporal Gyrus and Wernicke's area, which "is the major area involved in the comprehension of the language" [1]. Substructures of the limbic system, such as the Cingulate Gyrus, are associated with regulating emotions and processing smells; the amygdala participates in fear processing, emotion processing, learning, the fight-or-flight response and reward processing. The basal ganglia also contribute to emotional behaviors, reward and reinforcement, habit formation, and addictive behaviors [1]. The amygdala contributes to linking perception with memory and to automatic emotional responses [11]. The thalamus participates in relaying information between the brain stem and the cortex, and also with other cortical structures [1]. This role in cortico-cortical interactions involves many brain processes, such as perception, attention, timing, alertness and consciousness, and it contributes to perception and cognition [1]. The Frontal Lobe "is the main site of so-called 'higher' cognitive functions" [1]. The substructures of the frontal lobe contribute to thought, decision-making, planning, problem solving (i.e., voluntary behavior), cognition, intelligence, attention, language processing and comprehension, etc.
[1]. Particularly, the prefrontal cortex contributes to higher brain functions. It participates in the executive system, for example judgement, reasoning and planning, and it also participates in personality and emotion by evaluating and controlling appropriate social behavior [1]. Also, the cerebral cortex's Somatosensory, visual and auditory association areas participate in higher cognitive and emotional processes such as memory, learning, the interpretation of sensations and speech; they are connected, except for the primary areas, to each other and to the Neothalamus [2]. Further, three association areas (the limbic, posterior and anterior association areas) participate in association and executive processing. They "are adjacent to their respective primary sensory cortical areas"; for association functions especially, the farther they are from the primary sensory area, the more general the functions are [22]. The limbic association area "links emotion with many sensory input and is important in learning and memory", the posterior association area "link information from primary and unimodal sensory areas and it is important in perception and language", and the anterior association area "links information from other association areas and is important in memory, planning and higher-order concept formation" [22]. In addition to the foregoing, Wang et al. [23] list the following among the higher cognitive processes: recognition, imagery, learning, deduction, induction, explanation, analysis, synthesis, creation, analogy and quantification. Working memory is closely associated with cognitive functions; e.g., a recent study in the journal Frontiers in Aging Neuroscience presents that challenging cerebral tasks can improve cognitive functions related to working memory, such as complex reasoning, processing speed and abstract thinking [24,25]. Bechara et al. [15] consider working memory to be part of the cognitive functions of the frontal lobe.

Framework to categorize the cognitive functions
According to Wright [22], the brain functions which are localized to specific regions of the brain have considerable clinical importance, since the localization of a function can explain "why certain syndromes are characteristic of disease in specific brain regions". However, "no part of the brain works in isolation. Each and every part of the brain works in concert with every other part. When a part of the brain is removed, the resulting behavior may reflect more about the adjusted capacities of the remaining "parts" than the removed part" [22]. Accordingly, our interest in this paper concerns the brain functions, and based on the descriptions in Section 3 we tabulated the brain functions in nine groups: Language functions,

Auditory functions, Visual functions, Motor functions, Homeostasis functions, Emotion and Behavior functions, Sensation functions, Memory functions, and Cognitive functions. These categories are obtained from the intentional functions of the brain and from the paths through which sensory stimuli transition into the cognitive processes and further to the outcome. These categorical groups are composed from the bottom up, i.e., from the basic cognitive brain functions to the higher-level cognitive functions. As described in Section 3, "all cognitive functions result from the integration of many simple processing mechanisms" [1]; likewise, the three association areas (i.e., the limbic, posterior and anterior association areas) participate in association and executive processing and "are adjacent to their respective primary sensory cortical areas", where the farther the association functions are from the primary sensory area, the more general they are [22]. Hence, the functionality can be represented as a slice through the categories (i.e., from top to bottom (from outcome to input) or from bottom to top (from input to outcome), as desired). A slice of the functional categories could be further divided by cognitive degree (e.g., cognitive functions, higher cognitive functions, executive cognitive functions, complex cognitive functions). Further, the nature of the cognitive functions (i.e., the result of the integration of many processing mechanisms) means that a common function involved in multiple processes is listed in each participating process. For example, the facial expression, empathy and imitation functions are listed in both the motor and the emotion and behavior processes. The cognitive groups of functions are presented in Figure 2. All processes and functions represented herein are described in Section 3. The processes represented with a dashed line describe unlimited interactions between the processes.
The result of a cognitive function is the result of multi-functional cooperation. For example, for the functionality of the language processes it is important to cooperate with other processes, such as visual information, auditory information and motor processes. Figure 3 gives an example of the functions of the language processes, where, e.g., we cannot speak well without the movements of the lips and tongue.
The cognitive groups of functions form a functional interactive hierarchy (Figure 4). The hierarchy structure of the cognitive functions underpins the functions of the structures of the brain. These categories are obtained from the intentional functions of the brain and from the paths through which sensory stimuli transition into the cognitive processes. The groups of brain functions are formed, according to process, into nine main functional hierarchies: visual functions, auditory functions, motor functions, sensation functions, homeostasis functions, language functions, emotion and behavior functions, memory functions, and cognitive functions.
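The nine-group hierarchy and the "slice" idea can be written down as a minimal data structure. This is only a sketch of the ordering described above (sensory-adjacent functions at the input end, cognitive functions at the outcome end); the function name `hierarchy_slice` is ours, not from the paper.

```python
# The nine functional hierarchies, ordered from sensory input to cognitive outcome.
HIERARCHY = [
    "visual functions",
    "auditory functions",
    "motor functions",
    "sensation functions",
    "homeostasis functions",
    "language functions",
    "emotion and behavior functions",
    "memory functions",
    "cognitive functions",
]

def hierarchy_slice(top_down=False):
    """Return the categories from input to outcome, or reversed (outcome to input)."""
    return list(reversed(HIERARCHY)) if top_down else list(HIERARCHY)

print(hierarchy_slice()[0])               # category closest to sensory input
print(hierarchy_slice(top_down=True)[0])  # category at the outcome end
```

Reading the slice bottom-up gives the input-to-outcome view used in the comparisons; reversing it gives the outcome-to-input view, matching the "as desired" direction mentioned above.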

The comparison of the cognitive functions
We have chosen cognitive service examples for the comparison of human and cognitive computing functions; the chosen cognitive services have online demonstrations whose cognitive functionality can be evaluated by experimenting. The McKinsey Global Institute analysis (i.e., Exhibit 4, p. 37) has, for example, found the social, cognitive and physical patterns of capabilities which are often required together to support many activities; hence, our chosen functionality examples of cognitive services and NDC applications are examples of these capabilities [26]. The examples are the following: IBM Visual Recognition [27], MS Speaker Recognition [28], and IBM Tone Analyzer [29]. The cognitive services are described alongside the comparisons.
First, we compare the human and computing cognitive functions. In Figure 5, we present an example of visual recognition from the perspective of the human cognitive functions (upper part of the figure) and, below it, the cognitive computing functions of the IBM Visual Recognition service (lower part of the figure). We then describe the main similarities from the underlying material basis.
Many human cognitive processes integrate and interpret information. As described in Section 3, the human cognitive functions are interactive and simultaneous. The required human cognitive functions largely depend on the content of the image: for example, if the image depicts an IKEA furniture assembly guide, then the number of required cognitive functions is normally greater than if the image is of a football. In this example (Figure 5), the selected cognitive functions are for multiple images. For example, a human can detect feeling states and actions in an image, such as facial expressions; for the image of a psychological test, a human needs to use imagination to recognize the image; and for images which contain text, we need language processes functions, such as reading comprehension, to detect the message. The IBM Visual Recognition service, by contrast, analyzes the images for faces and objects with deep learning algorithms [30]. The deep learning algorithms are the type of backward-chaining neural nets that "uses learning algorithms to progressively infer a pattern about a body of data by starting with the goal and then determining the rules that are inferred, which can then be used to reach other goals" [31]. The service can identify food and colors, categorize and tag the image, detect the face, age and gender, and for celebrities it returns the identity, knowledge graph and name. Further, the service can be trained to create custom classifiers [32]. The results are presented with a score (i.e., scores range from 0 to 1; a higher score equals a greater correlation). For example, the image of IKEA's assembly guide [33] gives as an outcome (Figure 6, on the left side) a 'study' class with a score of 1.0 and the colors gray (0.87) and olive green (0.86). We also tested the service with an image of the same product (Figure 6, in the middle); the image of the product is classified into the loudspeaker, electrical device and device classes with a score of 0.69.
The second-best scores are in the 'portfolio' and 'file folder' classes (0.60). Furthermore, the service returns the type hierarchies: /electrical device/loudspeaker, /recorder/black box, /electrical device/loudspeaker/subwoofer. The example of the face image (Figure 6, on the right side, [34]) is classified as a person (0.73), President of the United States (0.55), and reddish-orange color (0.67). Further, the service identifies a male (1.00) face aged between 55 and 64 (0.56). To improve the result, the service needs a customized classifier, and it needs training (positive and negative images of the subject).
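A client of such a service typically filters the returned classes by score, as in the examples above. The response shape below is a simplified stand-in based only on the fields discussed here (class names, 0-1 scores, type hierarchies); it is not the service's exact JSON schema.

```python
# Hedged sketch: filter and rank classification results by confidence score.
# sample_response mimics the scores reported above for the product image.
sample_response = {
    "classes": [
        {"class": "loudspeaker", "score": 0.69,
         "type_hierarchy": "/electrical device/loudspeaker"},
        {"class": "portfolio", "score": 0.60},
        {"class": "file folder", "score": 0.60},
    ]
}

def best_classes(response, threshold=0.5):
    """Keep classes at or above the threshold, highest score first."""
    hits = [c for c in response["classes"] if c["score"] >= threshold]
    return sorted(hits, key=lambda c: c["score"], reverse=True)

for c in best_classes(sample_response, threshold=0.6):
    print(c["class"], c["score"])
```

With a 0.6 threshold, the loudspeaker class (0.69) ranks first and the two 0.60 classes follow, matching the best and second-best results described above.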
We realized that the extent of the functionality of the visual recognition of humans and of cognitive computing is not the same; therefore, we chose to compare cognitive computing against human cognitive functions. In the same manner (i.e., comparing computing cognitive functions versus human functions), the comparison of the cognitive functions of the cognitive services is presented in Section 5.1 and the comparison of the cognitive functions of the NDC applications in Section 5.2.

The cognitive functions in cognitive services versus human functions
In the following, we illustrate the main functionality of each cognitive service alongside the similar functions of the human, as follows: IBM Visual Recognition, MS Speaker Recognition, and IBM Tone Analyzer. The functional similarities are presented in order: first the computing functionality and then, in brackets [ ], the similar functionality in the human cognitive functions.
5.1.1 Visual recognition. The following describes the similarities between IBM Visual Recognition and human cognitive functions (Figure 7). The service receives the image either through a link or through the drop-image function, and processes it [receive sensory information; processing sensory information]. The neural network uses the inferred rules to categorize the image (i.e., the "learning algorithms to progressively infer a pattern about a body of data by starting with the goal and then determining the rules that are inferred, which can then be used to reach other goals") [30]. The rules for the service have been inferred during the training phase (i.e., when it has been trained to recognize a certain type of image). In the training phase, the neural net uses training sets of images, which it processes to infer the patterns and to form the rules [visual memory, memory retrieval, working memory, memory processing, memory acquisition (i.e., the acquisition of the rules), learning]. The acting service categorizes the inputted image [35].
5.1.2 Speaker recognition. Speaker verification relies on the unique characteristics of the voice (i.e., a unique voice signature), which is used to identify a person. Further, 'speaker identification' compares the audio input (i.e., the voice) with a provided group of speakers' voices, and if a match for the voice is found, then the speaker's identity is returned. Finding a match for the voice requires pre-registration of the speaker's voice signature.
The service (Figure 8) receives the audio input through a microphone [receive sensory information]. The process is divided into two parts, the first of which does speaker verification and the second speaker identification. Speaker verification relies on the unique characteristics of the voice (i.e., the unique voice signature), which is further used to identify a person. The creation of the unique voice signature (i.e., enrollment) precedes successful speaker recognition. In the voice signature enrollment part, the system extracts the predefined features from the audio example and creates the model [36,35] [Processing sensory information, Sensory interpretation, Interpretation of the audio stimuli, Auditory information processing, Sound processing, Speech processing, Language processing, Tonotopic map functions, Processing combinations of frequencies, Processing changes in amplitude or frequencies, Personality expression, Analysis, Synthesis, Creation, the learning of unconditioned associations, Association, Learning, Auditory map, Auditory memory, Working memory, Preprocessing the memorable information, Memory processing, Memory functions, Memory acquisition, Memorizing, Connecting senses to memories, Choosing the memories to be stored]. In the verification, the Speaker Recognition service extracts the features from the audio input [Receive sensory information, Processing sensory information, Sensory interpretation, Interpretation of the audio stimuli, Auditory information processing, Sound processing, Tonotopic map functions, Processing combinations of frequencies, Processing changes in amplitude or frequencies, Speech processing, Language processing, Personality expression, Analysis, Memory processing, Memory retrieval, Working memory, Auditory memory]. In the speaker identification part, the extracted features are matched against the model [Analogy, Decision making, Recognition].
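The enroll-then-verify idea can be illustrated with a toy model. This is only a sketch of the principle under strong simplifying assumptions: a voice "signature" is modeled as the mean of enrollment feature vectors, and verification compares a new vector against it by cosine similarity. Real speaker-recognition features and models are far more elaborate, and the vectors and threshold here are invented for illustration.

```python
# Toy sketch of enrollment and verification with cosine similarity.
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def enroll(feature_vectors):
    """Build a simple signature: the element-wise mean of enrollment vectors."""
    n = len(feature_vectors)
    return [sum(col) / n for col in zip(*feature_vectors)]

def verify(signature, features, threshold=0.9):
    """Accept the speaker if the new features are close enough to the signature."""
    return cosine(signature, features) >= threshold

signature = enroll([[0.9, 0.1, 0.4], [1.0, 0.2, 0.5]])
print(verify(signature, [0.95, 0.15, 0.45]))  # same speaker: True
print(verify(signature, [0.1, 0.9, 0.8]))     # different speaker: False
```

Speaker identification then amounts to running the same comparison against each signature in a registered group and returning the best match, as described above.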
The similarities in the functional hierarchy and the principal basic functions are in the auditory information, language, and emotion and behavior functions hierarchies.
5.1.3 Tone analyzer. The IBM Tone Analyzer service uses the unsupervised learning algorithm GloVe to detect social tones from textual input. The Tone Analyzer extracts the tones of the text as emotions such as anger, disgust, fear, joy and sadness; language styles such as analytical, confident and tentative; and social tendencies such as openness, conscientiousness, extraversion, agreeableness and emotional range [37].
The IBM Tone Analyzer service (Figure 9) detects tones from textual input. It uses the unsupervised learning algorithm GloVe to detect social tones (i.e., it tokenizes the text, obtains the vector representation of the words of the text and further processes it in order to generate meanings) [38]; in more detail, see [39]. [Receive sensory information, Processing sensory information, Word retrieval, Written language processing, Language processes, Language processing, Information processing, Reading functions, Semantic memory processing, Learning of conditioned associations, Learning, Comprehension of complex syntax, Comprehension of reading functions, Language understanding, Language memory processing, Executive processing, Association, Recognition, Abstract thinking, Deduction, Analysis, Analogy, Quantification, Cognition, Memory, Memory functions, Memory retrieval, Memory processing, Working memory, Choosing the memories to be stored, Choosing the place where memories are stored, Preprocessing the memorable information, Memory formation, Memory storage, Memory acquisition, Memorizing, New memory formation, Number representation].
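The tokenize-then-vectorize step described above can be sketched as follows. GloVe itself learns its word vectors from co-occurrence statistics over large corpora; the tiny hand-made vectors below only stand in for that learned representation.

```python
# Illustrative pipeline: tokenize the text, look up a vector per word,
# and combine the word vectors into one representation of the input.

TOY_VECTORS = {                      # stand-ins for learned GloVe vectors
    "i": [0.1, 0.0], "am": [0.0, 0.1],
    "very": [0.2, 0.1], "happy": [0.9, 0.8],
}

def tokenize(text):
    return text.lower().split()

def text_vector(text):
    """Average the word vectors of the tokens (unknown words are skipped)."""
    vecs = [TOY_VECTORS[t] for t in tokenize(text) if t in TOY_VECTORS]
    if not vecs:
        return None
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(len(vecs[0]))]

print(text_vector("I am very happy"))
```

The resulting vector is what downstream machine learning steps (the tone classifiers described next) would consume.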
The GloVe model can be used to output word analogies, and its scoring can further be compared with human judgments [38]. The Tone Analyzer processes the results of GloVe in machine learning algorithms, in which it extracts the Big Five (i.e., the personality model whose higher-level dimensions are openness, conscientiousness, extraversion, agreeableness and emotional range), Needs and Values characteristics. The service uses a model that is trained against ground-truth data (i.e., corpora) [Learning]. The Tone Analyzer extracts the tones of the text as follows [37]: the service returns each tone with a score, where in the emotion and language style analyses a score of 0.5 or less indicates that the tone is unlikely to be detected in the content, and a score greater than 0.75 indicates a higher likelihood [Number representation, Quantification]. For the social tendencies, a score of less than 0.5 indicates that the content is more likely to be perceived as the low end of the trait; for the agreeableness tone, for example, this is selfish, whereas a score higher than 0.75 is perceived as caring (in more detail, see [40]). The similarities in the functional hierarchy and the principal basic functionalities are in the visual information, language, and emotion and behavior functions hierarchies.
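The documented score thresholds (0.5 and 0.75) can be expressed as a small interpretation helper. The wording of the returned labels is paraphrased from the description above, not quoted from IBM's documentation.

```python
# Sketch of the score interpretation described in the text.

def interpret_emotion_score(score):
    """Map an emotion/language-style tone score to a likelihood label."""
    if score <= 0.5:
        return "unlikely to be detected"
    if score > 0.75:
        return "very likely to be detected"
    return "likely to be detected"

def interpret_agreeableness(score):
    """Social-tendency example from the text: low = selfish, high = caring."""
    if score < 0.5:
        return "perceived as selfish"
    if score > 0.75:
        return "perceived as caring"
    return "between the two"

print(interpret_emotion_score(0.8))
print(interpret_agreeableness(0.3))
```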

The NDC application versus human functions
The NDC application is an application that is not defined as cognitive; it is a programmable system that has rules and predetermined processes for producing results (i.e., the outcome of a non-cognitive platform) [41]. The focus of this study is to describe what types of computing functions could be considered cognitive; therefore, we first compare the NDC application with human cognitive functions and thereafter analyze the differences and similarities between the NDC and cognitive applications (Section 7). We have chosen NDC applications that have descriptions available online. The chosen applications are the Apple Watch Activity app [42] and the Apple iPhone Parked Car service [43]. The functional similarities are explained in the following order: first the computing functionality, and then the similar functionalities in human cognitive functions are listed in brackets [ ]. 5.2.2 Parked Car. The Apple iPhone Parked Car service works with Maps. In the example (Figure 11) of the Parked Car service, only the car-parking information part is presented, even though the service also performs other functions such as making reservation requests for Uber.
The car-parking function determines the current location of the car (i.e., latitude and longitude) when the iPhone disconnects from CarPlay or Bluetooth; it shows a message to the user and saves the location in the database. Further, it shows the location to the user in the Maps application. When needed, the Maps application gives the directions that guide the user to the car [43]. [Receive sensory information, Working memory, Processing sensory information, Sensory interpretation, Integrate information (where the things are), Integrate information (what the things are), Preprocessing the memorable information, Information processing, Association, Higher-order concept formation, Memory formation, Memory storage, Choosing the memories to be stored, Choosing the place where memories are stored, Spatial mapping, Memorizing, New memory formation, Memory, Memory processing, Memory functions, Memory retrieval, Reference map functions of near and distant space, Executive processing, Spatial navigation.] The similarities in the functional hierarchy and the principal functionalities are in the memory, cognitive, sensation, and visual information functions hierarchies.
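The rule-based, predetermined character of the Parked Car behaviour described above can be sketched as follows. The class and method names, and the coordinates, are hypothetical; this is not Apple's API.

```python
# Sketch of the described rule: on CarPlay/Bluetooth disconnect, save
# the current coordinates; the Maps app later retrieves them.

class ParkedCarService:
    def __init__(self):
        self.parked_location = None    # (latitude, longitude)

    def on_disconnect(self, latitude, longitude):
        """Triggered when the iPhone disconnects from CarPlay or Bluetooth."""
        self.parked_location = (latitude, longitude)
        return "Parked car location saved"

    def get_parked_location(self):
        """Retrieved by the Maps app to show the pin and give directions."""
        return self.parked_location

svc = ParkedCarService()
print(svc.on_disconnect(60.4518, 22.2666))  # hypothetical coordinates
print(svc.get_parked_location())
```

Note that the sketch contains no training or learning step: the result is produced entirely by predetermined rules, which is what distinguishes the NDC application from the cognitive services above.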

Results and discussion
In the following, we analyze the differences and similarities between the NDC application and cognitive service results. The functionalities of the applications are not directly comparable to each other, because they perform different tasks. Still, we can compare their cognitive functionalities on the scale of human cognitive functions (Table 1). Cognitive applications use about 70% of the example hierarchy of cognitive functions at levels I, III and IV, compared with NDC applications, which use about 30%. In contrast, at level II (i.e., Language functions and Emotion and behavior functions) there are no NDC applications.
The content analysis of the cognitive function descriptions reveals differences in the functional types. Cognitive services use machine learning algorithms for data processing, which reveals a difference between the implementation techniques of these applications. Furthermore, the cognitive services in cognitive platforms are implemented in a neural network, which imitates the human nervous system. We illustrate these differences in Appendix B (see also Appendix C, Figures 13 and 14 and Tables 3 and 4). The comparison of the cognitive function descriptions reveals differences between the functionalities of the cognitive and NDC applications (i.e., functionalities that are included in the cognitive applications but are missing from the NDC applications). Examples of the differences are as follows: Language functions (e.g., language understanding); Emotion and behavior functions (e.g., emotion processing); Memory functions (e.g., learning); cognitive function hierarchies (e.g., Occipital lobe, etc.); and Cognitive functions (e.g., Interpretation of sensations, Recognition). These differences between the two types of applications are the results of the functional hierarchy of cognitive functions and are consistent with cognitive computing literature such as [44-47].

As background for the comparison, examples of the brain structures underlying the functional hierarchy, with their functions, are as follows. All cognitive functions result from the integration of many simple processing mechanisms distributed throughout the brain. The outer layer of the forebrain constitutes the familiar wrinkled tissue that is the cerebral cortex, or cortex. The large folds in the cortex are called gyri; the small creases within these folds are sulci. Each hemisphere of the cortex consists of four lobes: frontal, parietal, temporal and occipital. Other important structures are the brainstem, the cerebellum and the limbic system (which includes the amygdala and hippocampus). It is possible for the brain to repair damaged neural networks or to compensate for the loss of function in particular structures; common impairments resulting from brain damage include deficits in attention, emotion, language, learning, memory, movement, perception and sensation. Associated functions: arousal, emotion, language, learning, memory, movement, perception, sensation, thinking and many others.

Amygdala
The amygdala has three functionally distinct parts: (1) the medial group of subnuclei has many connections with the olfactory bulb and olfactory cortex.

Hippocampus
Is an early storage place for long-term memory and is the structure in the brain most closely aligned with memory formation. It is involved in the transition of long-term memory to even more enduring permanent memory, and it also plays an important role in spatial navigation. Functions: early memory storage, formation of long-term memory, spatial navigation.

Hypothalamus
Regulates a wide range of behavioral and physiological activities. It controls many autonomic functions such as hunger, thirst, body temperature and sexual activity. To do this, it integrates information from many different parts of the brain and is responsive to a variety of stimuli including light (it regulates circadian rhythms), odors (e.g., pheromones), stress and arousal (hypothalamic neurons release oxytocin directly into the bloodstream). Other functions controlled by it include parenting behavior, perspiration, blood pressure and heart rate. Functions: hunger, thirst, body temperature, sexual activity, arousal, parenting, perspiration, blood pressure, heart rate, shivering, pupil dilation, circadian rhythms, sleep.

Its connections allow it to specialize in associating objects with sensory information and potential consequences (e.g., reward). Functions: object recognition, memory formation and storage.

Pons
Is the region of the brain most closely associated with breathing and with the circuits that generate respiratory rhythms. It forms a bridge between the cerebrum and cerebellum and is involved in motor control, posture and balance. It is involved in sensory analysis and is the site at which auditory information enters the brain. Functions: regulating breathing, taste and autonomic functions.

Conclusion
The focus of this study is to describe what types of computing functions could be considered cognitive. This paper does not attempt to present all of the cognitive functions of the human brain; rather, it is intended to introduce the idea of cognitive computing functions and their similarity to human cognitive functions through examples. Setting the cognitive computing functions into the hierarchy of functions allows us to identify the type of cognition and to achieve comparability between applications. In this paper, we have constructed a framework to categorize the cognitive computing functions. The hierarchical structure of cognitive functions is grounded in the functions of the brain's structures. Its categories are obtained from the intentional functions of the brain and from the paths through which sensory stimuli transition into cognitive processes. To characterize the cognitive computing functions, we chose cognitive service examples that have online demonstrations. We then described their functionality, mapped the cognitive service example functionalities to the brain functions and presented their similarities, using the framework that classifies and hierarchizes the cognitive service functions. For comparison, we mapped the NDC application functionalities to the brain functions and presented their similarities. Finally, we summarized the differences and similarities between the cognitive computing and NDC functions. Both types of examples produce results of cognitive functions, and their differences lie in the functional hierarchy categories of Language functions, Emotion and behavior functions, Memory functions (e.g., learning) and Cognitive functions (e.g., interpretation of sensations).
The sample material is quantitatively small, so our proposal for enhancing the framework is to implement it with machine learning, in which cognitive functions would gain weights (e.g., at each categorical level). This would enable the refinement of the function descriptions (i.e., quantitatively more samples) and thereby improve comparability.
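As a sketch of this proposal, each mapped cognitive function could carry a weight per hierarchy level, yielding a single comparable score per application. All weights, level labels and example mappings below are invented for illustration.

```python
# Hypothetical weighting of mapped cognitive functions by hierarchy level.

LEVEL_WEIGHTS = {"I": 1.0, "II": 2.0, "III": 3.0, "IV": 4.0}

def comparability_score(mapped_functions):
    """mapped_functions: list of (function name, hierarchy level) pairs."""
    return sum(LEVEL_WEIGHTS[level] for _, level in mapped_functions)

cognitive_app = [("Learning", "III"), ("Language understanding", "II"),
                 ("Recognition", "IV")]
ndc_app = [("Memory storage", "I"), ("Spatial navigation", "I")]

print(comparability_score(cognitive_app))  # 9.0
print(comparability_score(ndc_app))        # 2.0
```

In a learned version, the weights themselves would be fitted to data rather than fixed by hand, which is the refinement the proposal envisions.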
Cognitive computing functions produce cognitive results, for example in a neural network that mimics the human nervous system. Although the infrastructure achieves similarity with human cognitive functions, cognitive algorithms such as GloVe, which mimic human cognitive functions, are also needed. The combination of these features goes beyond human abilities (i.e., in quantity, speed and variety), for example in data retrieval and in the ability to combine data from different sources to generate knowledge.
Compliance with ethical standards
Funding: No funding.
Ethical approval: This article does not contain any studies with human participants or animals performed by any of the authors.
The human cognitive functions similar to the cognitive functions of the NDC applications are listed in Table 4. The functions divide into three cognitive function hierarchies (Figure 14): Memory functions (41%); Visual, Auditory, Motor, Sensation and Homeostasis functions (31%); and Cognitive functions (28%). Language functions and Emotion and behavior functions (i.e., level II of the hierarchical cognitive functions) are not in use (0%).

Table 3. The cognitive services and their cognitive functions.

Service | The cognitive functions
IBM Visual Recognition | Learning, the higher-order concept formation, visual perception, interpretation of visual information, recognition; object and face recognition; spatial mapping
MS Speaker Recognition |

Table 4. The NDC applications and their cognitive functions.

Figure 14. The NDC applications' functions by the functional hierarchy of cognitive functions.