VIVACE is another CCEx project in the field of artificial intelligence and user interfaces.
Here is what the acronym stands for:
- Virtual Intelligence: an artificial intelligence interface meant to receive input from the user and supply information back to them.
- Virtue: the virtue of a virtual intelligence is its ability to process information faster, better, and with more possibilities than a human being alone.
- Assisted Computer Experience: the VI (virtual intelligence) drastically enhances the computer user's experience (productivity, usability, information accessibility, eye candy, etc.).
But the best part of this project is that VIVACE is meant to use pre-existing hardware, such as current or moderately old computers and their peripherals, to achieve its goals.
Here is an update on the VIVACE project; there should be more progress on it soon:
VIVACE's GUI (Graphical User Interface) would look like glass: transparent and superimposed on the OS's (Operating System's) GUI or desktop environment. It would use all available sensors and peripherals, such as webcams, microphones, temperature sensors, keyboards, mice, fan speed sensors, and other possible PC or Mac extensions, to build an awareness of the user's environment, both inside the operating system and outside it in the physical world, in order to help the user in his daily tasks.
The Internet would be used as well to get information about the user's location and other parameters in order to improve his experience: meteorological information could then be displayed to him, as well as routes to his destinations when traveling or moving away from the computer, much like what is already available in personal assistants such as Siri on iOS and Cortana on Windows.
The webcam, for instance, could be used for facial recognition to unlock a previously locked OS session after a period of inactivity, or after VIVACE detects that the user has left the computer via the webcam or microphone (if VIVACE no longer senses any sound).
The application would be free and open source on Linux, and sold for Windows and macOS for a fixed fee to be decided later.
The project will use the CCEx company’s internal funding as well as some form of crowdfunding.
As a base, the ALICE AIML language would be used for the project, with extensions on top of it to further customize the AI to better suit VIVACE's needs. Python would be used too, as VIVACE, like leet, will be built with this programming language.
Portions of C could be used to extend Python for performance reasons if need be.
The AIML language and ALICE are involved in the project because dictionaries are already available that allow the AI to hold discussions with the user that are as close to meaningful as possible, which is a planned feature for VIVACE.
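To illustrate the kind of pattern-to-template matching AIML provides, here is a tiny hand-rolled sketch in Python. It is not the AIML engine itself, and the two categories are made-up examples; the real project would load ALICE's actual AIML files into a proper interpreter.

```python
# Minimal sketch of AIML-style matching: each category maps a pattern
# (with "*" wildcards, as in AIML) to a template response.

def matches(pattern, words):
    """Return True if the list of words matches the AIML-like pattern."""
    p = pattern.split()

    def rec(pi, wi):
        if pi == len(p):
            return wi == len(words)
        if p[pi] == "*":
            # "*" matches one or more words.
            return any(rec(pi + 1, wj) for wj in range(wi + 1, len(words) + 1))
        return wi < len(words) and p[pi] == words[wi].upper() and rec(pi + 1, wi + 1)

    return rec(0, 0)

# Illustrative categories only; real ones would come from ALICE's AIML sets.
CATEGORIES = [
    ("HELLO *", "Hello! How can I help you today?"),
    ("WHAT IS YOUR NAME", "My name is VIVACE."),
]

def respond(sentence):
    words = sentence.strip().rstrip(".!?").split()
    for pattern, template in CATEGORIES:
        if matches(pattern, words):
            return template
    return "I do not understand yet."
```

In the real system the dictionaries mentioned above would supply thousands of such categories instead of two.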
After discussing with our team, we decided that the transparent glass VIVACE GUI should use Qt as its development framework.
We also decided that the project shall use as few dependencies as possible for the main program.
We shall also possibly use lm-sensors ("sensors-detect") for querying hardware status.
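Recent versions of lm-sensors can print their readings as JSON (`sensors -j`), which is convenient to consume from Python. The sketch below parses a sample imitating that output and extracts the temperature readings; actual chip and label names vary per machine, and in practice the JSON would come from running the `sensors -j` command rather than a hardcoded string.

```python
import json

# Sample imitating `sensors -j` output from lm-sensors (machine-dependent).
SAMPLE = """
{
  "coretemp-isa-0000": {
    "Adapter": "ISA adapter",
    "Core 0": {"temp2_input": 45.0, "temp2_max": 100.0},
    "Core 1": {"temp3_input": 47.5, "temp3_max": 100.0}
  }
}
"""

def read_temperatures(sensors_json):
    """Return {(chip, label): celsius} for every temp*_input reading."""
    temps = {}
    for chip, features in json.loads(sensors_json).items():
        for label, values in features.items():
            if not isinstance(values, dict):
                continue  # skip plain strings such as "Adapter"
            for key, value in values.items():
                if key.startswith("temp") and key.endswith("_input"):
                    temps[(chip, label)] = value
    return temps
```

VIVACE could poll this periodically and surface the readings on its glass overlay.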
We would also use V4L2 for webcam access when available, possibly even with biometric face detection to unlock functionalities or the desktop session.
We would use audio capture in conjunction with video (webcam) and mouse activity to detect whether the user is close to the computer and still willing to use it, in order to determine whether the session should be locked.
An Internet connection would be used to retrieve various data from the net, such as weather, location, calendars, mail, contacts (from a phone (Android, iOS) or other contact directories), and maps and itineraries for daily planning.
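As a small example of the weather part, the sketch below turns a weather service's JSON reply into a one-line status for the overlay. The response shape is hypothetical; a real integration would follow whichever provider the project eventually settles on, and would fetch the reply over the network rather than from a sample string.

```python
import json

# Hypothetical weather API reply (shape is an assumption for this sketch).
SAMPLE_REPLY = """
{"city": "Paris", "temperature_c": 18.5, "condition": "partly cloudy"}
"""

def weather_summary(reply_json):
    """Format a fetched weather reply as a one-line overlay status."""
    data = json.loads(reply_json)
    return "{}: {} C, {}".format(
        data["city"], data["temperature_c"], data["condition"])
```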
We would include a daily task manager, potentially with ML-based suggestions for daily tasks and planning.
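At its simplest, the "prediction" idea could start as a frequency baseline: suggest, for a given weekday, the tasks the user most often did on that weekday in the past. A real implementation might use an actual ML model; this counting sketch only shows the baseline behavior.

```python
from collections import Counter

def suggest_tasks(history, weekday, top_n=3):
    """history: list of (weekday, task) pairs from the user's past planning.
    Returns the tasks most frequently done on that weekday."""
    counts = Counter(task for day, task in history if day == weekday)
    return [task for task, _ in counts.most_common(top_n)]
```

Anything smarter (time of day, location, calendar context) could later replace this function behind the same interface.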
All the above features would be optional and could be enabled or disabled individually.
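That per-feature requirement could be met with a simple registry where every capability is toggled independently, as in the sketch below. The feature names used here are illustrative only.

```python
# Sketch of the per-feature toggle requirement: each capability is
# registered under a name and can be enabled or disabled on its own.

class FeatureRegistry:
    def __init__(self, features):
        self._enabled = {name: True for name in features}

    def enable(self, name):
        self._enabled[name] = True

    def disable(self, name):
        self._enabled[name] = False

    def is_enabled(self, name):
        return self._enabled.get(name, False)

# Illustrative feature names; the real list would follow the final design.
FEATURES = FeatureRegistry([
    "face_unlock", "presence_lock", "weather", "task_manager",
])
```

Each subsystem would then check its flag before starting, so disabling, say, the webcam features never affects the task manager.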
To be continued...