software framework for runtime-Adaptive and secure deep Learning On Heterogeneous Architectures

Periodic Reporting for period 2 - ALOHA (software framework for runtime-Adaptive and secure deep Learning On Heterogeneous Architectures)

Reporting period: 2019-07-01 to 2021-06-30

Deep Learning (DL) is an extremely promising instrument in the machine learning and artificial intelligence landscape. DL algorithms achieve very high performance in numerous applications involving recognition, identification and/or classification tasks; however, their adoption, and that of AI technologies in general, is hindered by the lack of low-cost and energy-efficient solutions.
Novel algorithm configurations, exploited in different domains, continuously improve the precision of DL systems. However, such advances come at the price of significant requirements in terms of processing power. Moreover, while the training phase is typically executed on high-performance computing facilities, recent trends in the modern computing landscape push towards an ever-increasing deployment of DL inference on embedded devices. With this approach, following the edge computing paradigm, DL systems can overcome the limitations of cloud-based computing in terms of latency, bandwidth requirements, security, privacy, and availability. Nevertheless, when DL is moved to the edge, severe performance requirements must coexist with tight constraints on power and energy consumption.
The ALOHA project has created a toolflow that facilitates the implementation of DL algorithms on heterogeneous low energy computing platforms. On the basis of input information such as problem definition, application constraints and description of the target processing architecture, ALOHA provides automation for key design flow stages, such as optimal algorithm selection, resource allocation and deployment.
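As a purely illustrative sketch, the three kinds of input the toolflow starts from (problem definition, application constraints, and target architecture description) can be pictured as a structured specification. The field names and values below are hypothetical and do not reflect the actual ALOHA input format.

```python
# Hypothetical illustration of the inputs the ALOHA toolflow consumes.
# All field names and values are assumptions chosen for readability,
# not the project's real specification format.
design_entry = {
    "problem_definition": {
        "task": "image_classification",     # recognition/identification/classification task
        "dataset": "path/to/labelled/data",
        "target_accuracy": 0.90,
    },
    "application_constraints": {
        "max_latency_ms": 50,       # latency budget for edge inference
        "max_energy_mj": 5.0,       # per-inference energy budget
        "security_level": "high",   # desired robustness of the deployed model
    },
    "target_architecture": {
        "platform": "heterogeneous_low_power_soc",
        "compute_units": ["cpu", "cnn_accelerator"],
        "memory_kb": 512,
    },
}
```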
In the ALOHA project, the toolflow is associated with three use cases, which have been used to assess its capabilities. For each use case, a demonstrator has been built and evaluated within the project.
The project has defined and created a set of utilities, integrated into a single toolflow, that automates the creation and deployment of custom-tailored CNN algorithms, optimized so that their inference is executable at the edge on embedded processing platforms. The tools are connected through REST APIs, and their execution can be controlled by a graphical user interface inspired by agile development principles, as sketched below.
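Because the individual tools are exposed through REST APIs, a client (such as the GUI or a script) can drive a design step with ordinary HTTP requests. The following is a minimal sketch of that interaction pattern; the base URL, endpoint names, and payload fields are assumptions for illustration and are not the actual ALOHA interface.

```python
import requests

# Hypothetical base URL of a locally deployed ALOHA tool service;
# endpoints and payload fields below are illustrative assumptions.
BASE_URL = "http://localhost:8080/api"

# Submit a design job combining problem definition and constraints.
response = requests.post(
    f"{BASE_URL}/jobs",
    json={
        "problem_definition": "cnn_classifier",
        "constraints": {"max_latency_ms": 50, "max_power_mw": 500},
        "target_platform": "embedded_soc",
    },
    timeout=30,
)
response.raise_for_status()
job_id = response.json()["job_id"]

# Poll the job until the selected and optimised CNN deployment is ready.
status = requests.get(f"{BASE_URL}/jobs/{job_id}", timeout=30).json()
print(status["state"], status.get("result_url"))
```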
The tool is available as open source; the packages are available at https://gitlab.com/aloha.eu/
In this way the tool can be exploited freely by potential users and can be considered a key enabling instrument to foster the adoption of Deep Learning in new industrial and academic projects. The tool is also exploited internally by the consortium. First, the development of the three reference use cases is the baseline for the creation of new products to be added to the use-case providers’ portfolios. Second, the tool-developing companies in the consortium will support their own tools with an integrated ecosystem of other utilities, increasing their appeal to customers. In general, all the software and hardware companies in the consortium will improve their time to market by using ALOHA in upcoming projects and will be able to provide new support to their customers.
The project results have been disseminated at multiple events for the computer science, processing architecture and embedded systems communities. This has enabled the formation of an incipient user community that is providing feedback for further improvements and new features.
The impact goal of the project, as stated in the DoA, is to “Reinforce and broaden Europe's strong position in low-energy computing by reducing the effort needed to include digital technology inside any type of product or service, including outside the traditional “high-tech” sectors.” To address this target, ALOHA intends “To study and provide methodologies and computer aided design support for effective implementation of Deep Learning algorithms on embedded systems, considering their prospective implementation on low-power computing platforms, helping the developer in all aspects, from application definition to deployment”.
The evaluation of the main key performance indicators defined for the project, assessed on the reference use cases, has demonstrated that the toolflow can reduce the development and deployment time for CNN algorithms on embedded platforms to days (or even hours for simpler use cases). Such a process requires weeks or months of effort when performed manually, and it usually requires very advanced skills that are hard to acquire, especially for small and medium actors on the market. ALOHA thus paves the way to ubiquitous adoption of CNNs. Moreover, the toolflow considers advanced aspects of the implementation, such as workload reduction, security evaluation and improvement, and adaptive management, which are not addressed simultaneously by similar utilities in the literature.