
Perceptive Sentinel – BIG DATA knowledge extraction and re-creation platform

Deliverables

Dissemination

In order to raise awareness about the project and promote the PerceptiveSentinel platform and its added value, specific tasks will be devoted to promotional activities, with each partner contributing in its own way and addressing its part of the user community.

AIS will publish a minimum of three scientific papers in international peer-reviewed journals with an impact factor and present the project's results at at least two international scientific conferences. AIS will further organise a national conference for agricultural experts and the Agricultural Extension Service. Furthermore, the project will be presented at AIS's regular events where the latest research results are presented to the end-user community (Open field day, Day of wheat, Berry days, Winegrowing and wine producers meeting, etc.).

SINERGISE, GEOVILLE and MAGELLIUM plan to attend several EO conferences (such as the Living Planet Symposium and regional land administration events) and the MARS (Monitoring Agricultural Resources) Annual Conference, with participants from the European Commission, the Joint Research Centre, representatives of EU Ministries of Agriculture and software developers. These events will be used to disseminate the idea within the governmental sector, as there are participants from practically all member states and candidate countries. A further targeted event is Earth Observation Open Science 2.0, a scientific conference organised by ESA that is well suited for sharing ideas about possible products based on EO data. SINERGISE will exploit the high visibility of Sentinel Hub (and later also the EO-Toolset), currently handling more than 1,000,000 data requests per week, to engage with the interested public and inform them about the results of the PerceptiveSentinel project.

L&F will disseminate the project's results both domestically and abroad. SEGES (L&F's precision farming R&D department) will focus on the dissemination of the project's results to end users. Hence, SEGES will focus on teaching farmers and farm advisors to use the new capabilities that the EO platform offers. This will be done through presentations and demonstrations at conferences and fairs, and through courses for farm advisors. Nearly all Danish farmers have an affiliated crop production advisor, so training of advisors is an effective strategy for implementing new technology in Danish farming, and a strategy for which SEGES has a proven track record of success.
- L&F SEGES will host at least one training course for farm advisors where the capabilities of the SEGES variable rate application software will be demonstrated.
- At least two articles about the project will be published on the agricultural information portal "Landbrugsinfo.dk".
- Project results will be presented at the L&F SEGES plant production congress (approximately 2,000 participants who are farmers, advisors, agricultural scientists or government civil servants) and at the international conference for precision agriculture.

JSI will present PerceptiveSentinel's results at dedicated machine learning conferences and through the publication of scientific articles. The Centre for Knowledge Transfer in Information Technologies, employing ten researchers and technical staff working in the areas of research results dissemination and eLearning, will participate in the project to provide a dissemination push to reach the wider scientific public (publishing articles on the PerceptiveSentinel project).
The centre is well known through its portal http://videolectures.net/ with multimedia materials of numerous scientific events, on-line training materials, and a collection of tutorials on different scientific fields.

Communication

During the project, communication activities will take the form of dedicated activities tailored to specific target audiences (the INITIAL PerceptiveSentinel community): (1) intermediate users and (2) specific end-user communities: farmers and farming associations (precision farming products), governments (CAP control and food security), insurance companies (risk assessment, damage assessment and index products) and forestry agencies (illegal logging and forest inventory). The goal of the communication activities is to promote the project, to attract potential intermediate and end users and to establish the conditions for a successful market entry. Two types of workshops will be organised with the agriculture champion user groups by L&F (Danish farming community), GEOVILLE (Austria) and AIS (Slovenia): (1) assessment of user requirements and (2) educational workshops to present the use-case EO VAS. L&F will host at least one training course for farm advisors where the capabilities of the EO VAS use cases and their integration into crop management solutions will be demonstrated. Hands-on trainings with intermediate users for the construction, publishing and integration of EO VAS will be organised to encourage the construction and publishing of new services and the usage of PerceptiveSentinel pre-processing capabilities. Direct interaction (mail, visits, presentations, one-on-one communication) with larger end users will be organised to acquire a snapshot of user expectations and to promote the capabilities of the PerceptiveSentinel platform.

Heterogeneous Data Pre-Processing for Stream Mining

JSI will be supported by RS experts (GEOVILLE and MAGELLIUM) and agriculture experts (AIS, L&F) to deliver the deep data understanding required for the development of coherent time-features. One of the main PerceptiveSentinel focuses will be the use of biophysical indices as the main input for developing TIME features. We intend to use different (streaming) aggregates, such as the moving or exponential moving average, minimum, maximum, histogram, correlation, variance, sum and others. Streaming aggregates can be applied over different time-windows. Spatial, radiometric and time-series features of EO data will be supplemented by domain-specific "agriculture" features to form a feature set: a list of attributes that enters the evaluation through data mining modelling. Streaming modelling and experimenting will be performed using standard JSI tools to evaluate the learning feature set. Feature extractors will be developed by JSI. Their functionality is to receive Stage-1, Stage-2 and non-EO data from the PerceptiveSentinel platform and to automatically extract features to be used in the learning and operational phases of streaming processing.
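
As a rough illustration of such windowed streaming aggregates, the sketch below computes a moving average, an exponential moving average, minimum, maximum, variance and sum over a sliding time-window of, for example, NDVI observations. The window size, smoothing factor and the StreamingAggregates helper are illustrative assumptions, not the project's actual implementation.

from collections import deque

class StreamingAggregates:
    # Toy windowed streaming aggregates over a per-pixel or per-parcel time series.
    def __init__(self, window=5, alpha=0.3):
        self.window = deque(maxlen=window)   # sliding time-window of recent values
        self.alpha = alpha                   # smoothing factor for the exponential moving average
        self.ema = None

    def update(self, value):
        self.window.append(value)
        self.ema = value if self.ema is None else self.alpha * value + (1 - self.alpha) * self.ema
        n = len(self.window)
        mean = sum(self.window) / n
        return {
            "moving_avg": mean,
            "exp_moving_avg": self.ema,
            "min": min(self.window),
            "max": max(self.window),
            "variance": sum((v - mean) ** 2 for v in self.window) / n,
            "sum": sum(self.window),
        }

# Example: feed a short NDVI time series and collect the derived TIME features
agg = StreamingAggregates(window=3)
time_features = [agg.update(ndvi) for ndvi in [0.21, 0.35, 0.52, 0.61, 0.58]]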

EO-data collection

This task comprises collecting EO time-series data (SENTINEL-1, SENTINEL-2, SENTINEL-3, LANDSAT, ENVISAT, MODIS, Planet and RapidEye) to support development, validation and demonstration activities. Time series will be collected as: (1) historical data, where available, over a time span supported by non-EO and in-situ data, and (2) up-to-date time series as they become available.
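
A minimal sketch of how the two collection modes could be organised per mission is given below; the download_time_series helper, the area of interest and the date ranges are hypothetical placeholders, not an actual PerceptiveSentinel interface.

from datetime import date

MISSIONS = ["SENTINEL-1", "SENTINEL-2", "SENTINEL-3", "LANDSAT",
            "ENVISAT", "MODIS", "Planet", "RapidEye"]

def collect(download_time_series, area_of_interest):
    for mission in MISSIONS:
        # (1) historical archive, limited to the span covered by non-EO / in-situ data
        download_time_series(mission, area_of_interest,
                             start=date(2015, 1, 1), end=date(2018, 1, 1))
        # (2) up-to-date acquisitions, kept current as new scenes become available
        download_time_series(mission, area_of_interest,
                             start=date(2018, 1, 1), end=None, keep_current=True)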

BIG-DATA infrastructure

A combination of own and leased infrastructure is foreseen to provide PerceptiveSentinel's processing, storage and communication infrastructure.

Initial PerceptiveSentinel Community

All of the consortium's dissemination and communication activities will lead towards the establishment of the Initial PerceptiveSentinel community, which will provide the push required to successfully enter the market within 24 months after project completion. Many potential members were already approached during project preparation (see letters of support in Annex 2) and many more will be approached during project execution.

Data verification

EO and non-EO data will be verified using multiband and hyperspectral imaging. AIS has already started microplot field experiments in which all microplots are regularly scanned with airborne and tractor-borne hyperspectral sensors. For verification purposes, these spectral signatures will be calibrated to top-of-atmosphere reflectance, adjusted to EO spectral bands, and up-scaled to satellite spatial resolutions. Thus, spatially and temporally precise data will enable verification of EO data. For ground truth and fast measurements, a field spectrometer will be used. L&F will acquire imagery using drones equipped with multiband cameras to provide verification data for the algorithms. Access to drone and camera infrastructure will be provided to the project free of charge by the operator of the Danish field trials (Danish Technological Institute), or L&F will use its own equipment (when available; still in the purchase process).
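
A minimal sketch of the band-adjustment step is shown below: a hyperspectral signature is averaged inside each satellite band. The boxcar band responses and the rough Sentinel-2-like band edges are illustrative assumptions; a real adjustment would use the sensors' published spectral response functions.

import numpy as np

def resample_to_bands(wavelengths, reflectance, band_edges):
    # Average hyperspectral reflectance inside each satellite band (boxcar response).
    band_values = []
    for lo, hi in band_edges:
        mask = (wavelengths >= lo) & (wavelengths <= hi)
        band_values.append(reflectance[mask].mean())
    return np.array(band_values)

wavelengths = np.arange(400, 901, 1)                      # 1 nm hyperspectral sampling (nm)
reflectance = 0.3 + 0.1 * np.sin(wavelengths / 50.0)      # synthetic spectral signature
s2_like_bands = [(458, 523), (543, 578), (650, 680), (785, 900)]  # rough B2, B3, B4, B8 ranges
band_reflectance = resample_to_bands(wavelengths, reflectance, s2_like_bands)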

EO VAS: Moisture CONTENT

PerceptiveSentinel platform services will be employed to design and publish EO VAS: Moisture CONTENT. EO VAS: Moisture CONTENT will be produced based on a combination of EO and non-EO feature time series and radar imagery (SENTINEL-1). The growth and yield of crops are driven by the amount of soil moisture available to the crop through rainfall and irrigation. The Moisture CONTENT service will provide knowledge to support 3rd party services such as irrigation scheduling, pest management, soil erosion and infiltration, and Crop YIELD. We plan to approach the problem through TIME-FEATURES processing of S1 data, accompanied by crop TYPE and crop CYCLE and complemented with other non-EO data and in-situ moisture measurements. Development will run in parallel and in close cooperation with services where service chaining is expected. Activities will be supported by SINERGISE and JSI, the first offering support in platform usage and the second providing support on the usage of streaming learning methods.

EO-QMiner: Stream Mining Models for Earth Observation

Several methods will be analysed during the evaluation of streaming learning models (see chapter: Streaming machine learning in Part B). The activities will result in a set of learning models to be incorporated into EO-QMiner. Integration between the PerceptiveSentinel platform and EO-QMiner is an essential integrative part of the platform, enabling:
- data exchange in both directions (the platform providing learning/interpretation data, EO-QMiner providing interpreted data)
- workflow control of EO-QMiner (by the platform)
- administrative control of EO-QMiner (by the platform)
The EO-QMiner integration layer will provide JSI's part of the integration capabilities. Code from JSI's open-source repository QMiner will be used to construct EO-QMiner. A certain amount of new development is envisaged in two areas:
- adaptation to streaming processing and
- incorporation of new learning technologies.
Integration and functionality testing will be performed by JSI to eliminate bugs and validate the integration into the PerceptiveSentinel platform.
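
EO-QMiner will build on JSI's QMiner codebase; as a library-agnostic illustration of the incremental (streaming) learning pattern described above, the following sketch uses scikit-learn's partial_fit interface. The model choice, class codes and callback names are assumptions for illustration, not the EO-QMiner API.

import numpy as np
from sklearn.linear_model import SGDClassifier

classes = np.array([0, 1, 2])        # e.g. three crop-type codes (illustrative)
model = SGDClassifier()              # linear model that supports incremental updates

def on_new_batch(features, labels):
    # Called whenever the platform pushes a new batch of labelled learning data;
    # the model is updated in place instead of being retrained from scratch.
    model.partial_fit(features, labels, classes=classes)

def interpret(features):
    # Called by the platform to obtain interpreted (classified) data back.
    return model.predict(features)

# Toy usage with random feature vectors of length 10
rng = np.random.default_rng(0)
on_new_batch(rng.normal(size=(32, 10)), rng.integers(0, 3, size=32))
predictions = interpret(rng.normal(size=(4, 10)))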

EO VAS: Cultivated AREA

Cultivated AREA is a product that will establish a mask of cultivated areas, which will later be used as an input for Crop YIELD estimation (yield is therefore estimated only for cultivated areas). Cultivated areas should not be mistaken for field parcels (the declared unit of cultivated land), since the declared and actual situations may frequently differ. In our project, field parcels are not considered a good approximation, since one parcel can be cultivated only partially, cultivated with multiple crop types, or left bare. The cultivated area mask will be developed for a specific scenario, using only data from the initial growth period, thus enabling early yield predictions. These scenario requirements will demand an orientation towards non-temporal features. The Cultivated AREA USE CASE will be based on deep learning algorithms (MAGELLIUM). The product will be constructed using deep learning algorithms integrated into PerceptiveSentinel's processing library. Cultivated area detection is required early in the crop growing period, so the service will not be able to use long time series. Depending on the requirements, this service will provide a pixel-wise classification (cultivated/non-cultivated), a binary mask or a vector.
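
A minimal sketch of such a pixel-wise cultivated/non-cultivated classifier is given below, assuming a 4-band early-season composite as input. The small convolutional architecture and the 0.5 decision threshold are illustrative assumptions, not MAGELLIUM's actual deep learning model.

import torch
import torch.nn as nn

class CultivatedAreaNet(nn.Module):
    # Tiny fully convolutional network producing one logit per pixel.
    def __init__(self, in_bands=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_bands, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=1),
        )

    def forward(self, x):
        return self.net(x)

model = CultivatedAreaNet()
patch = torch.rand(1, 4, 64, 64)              # one 64x64 early-season patch, 4 bands
mask = torch.sigmoid(model(patch)) > 0.5      # binary cultivated-area mask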

PerceptiveSentinel platform

All software components will be integrated into one system: the PerceptiveSentinel platform. Comprehensive testing, first integration testing and later functional testing, will be performed to verify all functionalities. The system roll-out will enable the PerceptiveSentinel platform to run in DEMO operational mode, allowing validation and demonstration activities to begin. Users will receive all the support and help required to use the system and design processing chains. A bug reporting system will be established in order to capture and record all software issues. Hot fixes will be provided in the event of system-critical failures.

Demonstration data set

Demonstration data sets will be collected for selected DEMO area(s) where all of the functionalities and capabilities of PerceptiveSentinel (platform and products) can be accessed and investigated.

EO modelling data (biophysical indices)

"In order to facilitate WP4 activities, the time-series data will have to be pre-processed ""manually"" and level-2B products prepared for domain experts before PerceptiveSentinel platform is operational."

Integration services

Integration services will enable external CHAINING of PerceptiveSentinel services. The development of integration services will be a parallel project spanning all platform development iterations, with strict standard constraints imposed already in the system design phase. The following integration services to support chaining into 3rd party front-offices are foreseen:
- DATA integration services: WMS, WMTS and WFS data-source services based on OGC standards will be provided in the form of a "Configurable data-service" to enable integration with (1) off-the-shelf GIS tools such as QGIS, ArcGIS or GeoServer, or with (2) existing proprietary applications such as the Land Parcel Identification System (LPIS), farm management tools, etc. (see the sketch after this list).
- Open-source integration: tiled web-map services will be provided for easy inclusion in open-source frameworks such as OpenLayers, Leaflet and others.
- Application integration: REST (representational state transfer) APIs will be provided for (1) integration of advanced functionality, such as time-lapse videos, statistical query results, change detection products, etc., and (2) integration with 3rd party products.
- WEB-SITE integration: PerceptiveSentinel map widgets will be provided to enable integration into 3rd party web-sites.
- Notification services will be provided to enable integration with 3rd party workflow engines.
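
A minimal sketch of consuming such an OGC data-source service from a 3rd party application is shown below. The endpoint URL and layer name are placeholders; the query parameters are standard OGC WMS 1.3.0 GetMap parameters.

import requests

WMS_ENDPOINT = "https://example.org/perceptive-sentinel/wms"   # placeholder URL

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "NDVI",               # hypothetical published layer
    "CRS": "EPSG:4326",
    "BBOX": "45.8,13.3,46.9,16.6",  # lat/lon bounds (EPSG:4326 axis order in WMS 1.3.0)
    "WIDTH": 512,
    "HEIGHT": 512,
    "FORMAT": "image/png",
}

response = requests.get(WMS_ENDPOINT, params=params, timeout=30)
with open("ndvi.png", "wb") as f:
    f.write(response.content)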

Spatial and radiometric features

Spatial and radiometric features, useful for data mining and high-level information extraction, will be defined and developed. They will be constructed using:
- radiometric characteristics,
- spectral characteristics,
- spatial or local parameters such as texture values, shape and geometric descriptors (line detection, polygons, specific textures for cultivated fields, extraction of characteristic points via SIFT/SURF), etc. (a simple texture example is sketched after this list).
Features will be extracted from single images from the EO data collection (SENTINEL-1, SENTINEL-2, SENTINEL-3, LANDSAT, ENVISAT, MODIS, Planet and RapidEye) but, if necessary and depending on the mining algorithms, they can be computed from several sources of data.
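
As a minimal illustration of a spatial texture feature, the sketch below computes the local variance of a single band in a sliding window; the window size and the uniform-filter approach are illustrative, and real descriptors (GLCM, SIFT/SURF) would be considerably richer.

import numpy as np
from scipy.ndimage import uniform_filter

def local_variance(band, size=7):
    # Var[X] = E[X^2] - E[X]^2, computed in a size x size neighbourhood of each pixel.
    mean = uniform_filter(band, size=size)
    mean_sq = uniform_filter(band * band, size=size)
    return mean_sq - mean * mean

band = np.random.rand(256, 256).astype(np.float32)   # synthetic single-band image
texture = local_variance(band, size=7)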

EO VAS: Crop CYCLE

PerceptiveSentinel platform services will be employed to design and publish EO VAS: Crop CYCLE. EO VAS: Crop CYCLE will describe the evolution of a given crop (using Crop TYPE as an input), corresponding to the vegetative development of the crop. Specific stages of growth (e.g. flowering, grain filling) are particularly sensitive to weather conditions and critical for the final yield. The timing of the crop cycle (phenology) determines the productive success of the crop. In general, a longer crop cycle is strongly correlated with higher yields, since a longer cycle permits maximum use of the available thermal energy, solar radiation and water resources. Crop cycle information will be an important input into Crop YIELD estimation (for example, water requirements vary significantly during the growth cycle). Development will run in parallel and in close cooperation with services where service chaining is expected. Activities will be supported by SINERGISE and JSI, the first offering support in platform usage and the second providing support on the usage of streaming learning methods.
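
A rough sketch of deriving simple phenology markers (start and peak of season) from an NDVI time series is given below; the 50%-of-amplitude threshold is a common convention but an assumption here, not the project's defined method.

import numpy as np

def phenology_markers(days, ndvi):
    # days: day-of-year per acquisition; ndvi: matching NDVI values
    ndvi = np.asarray(ndvi, dtype=float)
    threshold = ndvi.min() + 0.5 * (ndvi.max() - ndvi.min())
    start_idx = int(np.argmax(ndvi >= threshold))     # first acquisition above threshold
    peak_idx = int(np.argmax(ndvi))
    return {"start_of_season": days[start_idx], "peak_of_season": days[peak_idx]}

days = [95, 110, 125, 140, 155, 170, 185]             # illustrative acquisition dates
ndvi = [0.18, 0.25, 0.42, 0.63, 0.78, 0.74, 0.55]
markers = phenology_markers(days, ndvi)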

EO VAS: Crop DAMAGE

PerceptiveSentinel platform services will be employed to design and publish EO VAS: Crop DAMAGE. Crop DAMAGE will be used for the regular assessment of damage caused by hail, drought, floods and wind. The Crop TYPE and Crop CYCLE products will be chained to deliver Crop DAMAGE, since the impacts of unfavourable meteorological conditions and extreme events vary considerably, depending on the timing of occurrence and the development stage of the crops. It is our goal that, through regular labelling (entering ground-truth observations after each event), we will deliver a highly automated crop damage evaluation application. Development will run in parallel and in close cooperation with services where service chaining is expected. Activities will be supported by SINERGISE and JSI, the first offering support in platform usage and the second providing support on the usage of streaming learning methods.
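
A hedged sketch of a simple per-pixel damage indicator is shown below: the drop in NDVI between the last acquisition before an event (e.g. hail) and the first one after it, restricted to the cultivated-area mask. The 0.15 drop threshold and the random inputs are illustrative assumptions.

import numpy as np

def damage_mask(ndvi_before, ndvi_after, cultivated, drop_threshold=0.15):
    # Flag cultivated pixels whose NDVI dropped by more than the threshold.
    return ((ndvi_before - ndvi_after) > drop_threshold) & cultivated

ndvi_before = np.random.rand(128, 128)
ndvi_after = np.clip(ndvi_before - np.random.rand(128, 128) * 0.3, 0.0, 1.0)
cultivated = np.ones((128, 128), dtype=bool)          # e.g. from the Cultivated AREA product
damaged = damage_mask(ndvi_before, ndvi_after, cultivated)
damaged_share = damaged.mean()                        # fraction of pixels flagged as damaged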

Demonstration USE-CASE

The Demonstration USE-CASE will be realised through the integration of PerceptiveSentinel services into the Danish precision farming system. The integration will be realised by L&F's research and development department (SEGES), which will be fully supported during the integration by SINERGISE (offering training and an advisory service about PerceptiveSentinel's integration services, and providing hot fixes to integration services). Integration into one of SEGES' applications will be used to demonstrate the external chaining capabilities of the PerceptiveSentinel platform. The SEGES software suite consists of a range of products which all store their data in the Danish Field database. The Danish Field database is thus the cornerstone of the SEGES software solutions and is the infrastructure through which all the products are connected. The key product is the MarkOnline field management software, which is used for 85% of all Danish field management plans. SEGES is currently developing the CropManager platform within the project "FutureCropping". CropManager will be an integrated data platform that combines spatial data from different sources: farmers' own data from the Danish Field database, government data such as soil texture maps, remote sensing data from satellites, and data collected from sensors on farm machinery. CropManager will be able to process and visualise data and to send spatial data, e.g. application rate maps, directly to the most common tractor computers. In addition, CropManager will be able to accept output from other data processing sources, e.g. the PerceptiveSentinel services. From the PerceptiveSentinel project's perspective, CropManager will serve as an "external front-office" consuming PerceptiveSentinel services, upgrading them with specific agriculture services and providing a data link with MarkOnline, thus providing high added value to Danish farmers. CropManager is currently on schedule and is projected to be finished by the end of 2018. A second approach to integration will be designed for other user groups which may not be willing to invest in the CropManager solution. For this user group, another front-service, CropSat.dk, will be used to consume and expose PerceptiveSentinel services as a free-of-charge solution offered to less advanced farmers.

EO VAS: Crop YIELD

PerceptiveSentinel platform services will be employed to design and publish EO VAS: Crop YIELD. Crop YIELD will be a forecasting service generating crop yield forecasts automatically. Two types of forecasting will be supported: (1) micro scale, using available precision farming inputs (parcel polygon, type of crop, growth interventions, usage of pesticides, soil structure, ...) and specific EO VAS services (Moisture CONTENT, Crop CYCLE), and (2) country/regional scale, where the service will be based on a chain of predecessor services (Crop TYPE, Crop CYCLE, Crop DAMAGE, Moisture CONTENT) and data from generalised databases (pedologic data, precipitation maps). Crop YIELD forecasting will be a sophisticated product whose value will increase through its usage, as new learning data (the data about actual YIELD) will enter the system at the end of each season. Development will run in parallel and in close cooperation with services where service chaining is expected. Activities will be supported by SINERGISE and JSI, the first offering support in platform usage and the second providing support on the usage of streaming learning methods.
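
An illustrative sketch of the micro-scale forecasting step is given below: a regression model fed with per-parcel features derived from chained services (Crop TYPE, Crop CYCLE, Moisture CONTENT) plus non-EO inputs. The feature set, the synthetic data and the gradient boosting model are assumptions for illustration only; at the end of each season, the actual yields would be appended to the training data and the model refreshed.

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)
n_parcels = 200
X = np.column_stack([
    rng.integers(0, 5, n_parcels),        # crop TYPE (categorical code)
    rng.uniform(80, 140, n_parcels),      # crop CYCLE length in days
    rng.uniform(0.1, 0.4, n_parcels),     # seasonal mean moisture CONTENT
    rng.uniform(0.3, 0.9, n_parcels),     # peak NDVI
])
y = 2.0 + 8.0 * X[:, 3] + 5.0 * X[:, 2] + rng.normal(0, 0.5, n_parcels)   # synthetic yield (t/ha)

model = GradientBoostingRegressor().fit(X, y)
predicted_yield = model.predict(X[:5])    # forecast for the first five parcels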

EO VAS: Crop TYPE

PerceptiveSentinel platform services will be employed to design and publish EO VAS: Crop TYPE, a streaming machine learning product that will enable the recognition of different crop types based on their temporal FEATURES. The product will, like all other EO-QMiner products, be characterised by incremental learning capabilities, growing in power and accuracy over the years of operation. Development will run in parallel and in close cooperation with services where service chaining is expected. Activities will be supported by SINERGISE and JSI, the first offering support in platform usage and the second providing support on the usage of streaming learning methods.

Data management plan

The first version of the Data Management Plan (DMP) will be written in month 2. During the project, the DMP will be regularly updated, and in the project's after-life it will become a part of the PerceptiveSentinel Policy document. The main aspects addressed through the DMP will be:
- Data sharing: data sharing policies will follow the rules of the Open Research Data Pilot, providing open access to the research data generated through this project. Privacy and data ownership will be accounted for as well. The following principles will apply: (1) data generated through project research activities will be openly available, (2) all data which is available free of cost (for instance SENTINEL data) will be available on the same terms also through the PerceptiveSentinel platform and (3) all other data will be (or not be) available on the terms set by the data owner. The only exception to this rule is PerceptiveSentinel's DEMO REGION, where we will aim to provide ALL data free of charge (following special agreements with data owners). These principles will ensure that all data required to VERIFY project deliveries will be openly available.
- Data format of both the EO and in-situ data: this part will describe the main characteristics of the data and their provenance. The following macro-information will be managed for each data source and data set: (1) data-set reference and name, (2) data-set description, (3) data-set scope and goal for the project.
- Data protocols: the protocols to be used for data exchange within the PerceptiveSentinel system, and between the PerceptiveSentinel system and the external world, will be specified. Worldwide spatial standards (OGC WMS, WFS, WCS, GeoJSON) and non-spatial standards (XML) will be used.
- Catalogue data: metadata on all available data (historical, current and future) and available EO VAS will be provided in the form of a Data Catalogue, which will inform users of the availability of data and EO products in their target areas. Standard metadata formats will be privileged (OGC Catalogue Services, INSPIRE where relevant), while ad-hoc ones will be considered only if strictly necessary (to describe specific EO VAS).
- Archiving and preservation: a rolling-archive infrastructure (a data archive which keeps adding new data, with old data remaining available for a predefined amount of time) will be implemented on CLOUD infrastructure. Data-driven models will be implemented on the data-input side: automatic download of SENTINEL data will be provided for areas with active subscriptions and for PerceptiveSentinel's DEMO REGION. The data will be stored for the whole subscription period. Three months after a subscription expires, the data will be archived and then erased from the production database. Presentation data, meant to be used as a background data layer (Level-3A data with global coverage), will be refreshed once a month.

Streaming-learning validation report

Validation of the streaming learning capabilities of EO-QMiner will be performed in parallel with the validation and demonstration activities in WP6 and WP7. The accuracies of the implemented models will be analysed in different scenarios: changing the sequences of time-series data, analysing accuracy as a function of data-series length (forecasting), analysing accuracy in the presence of missing data (cloud coverage), etc. The results will be summarised in the Streaming-learning validation report.

Publications

Democratising Earth Observation Big Data With eo-learn: Application to Water-Level Monitoring

Authors: Matej Aleksandrov, Matej Batic, Miha Kadunc, Grega Milcinski, Rok Mocnik, Devis Peressutti, Blaz Sovdat, Anze Zupanc, Klemen Kenda
Published in: 2018
Publisher: KDD 2018, August 2018, London, U.K.

Crop classification using PerceptiveSentinel

Authors: Filip Koprivec, Matej Čerin, Klemen Kenda
Published in: 2018
Publisher: Data mining and data warehouses - SiKDD: proceedings of the 21st International Multiconference Information Society - IS 2018, October 8-12, 2018, Ljubljana, Slovenia

Spatio-Temporal Deep Learning: An Application to Land Cover Classification

Authors: M. Lubej, M. Aleksandrov, M. Batič, M. Kadunc, G. Milčinski, D. Peressutti, A. Zupanc
Published in: 2018
Publisher: https://www.researchgate.net/publication/333262625_Spatio-Temporal_Deep_Learning_An_Application_to_Land_Cover_Classification
