The Internet of Things in Archeology

Original authors: Erica Barone, Daniele Antonio Maggio
Let me introduce Stas Pavlov, our IoT expert, who has selected several interesting technology stories from the Microsoft Technical Case Studies series. All of the project's results can be found on GitHub. This story caught Stas's attention because IoT technology was applied in an unusual place: monitoring the condition of the ancient Mithraeum in Rome. Inside you will find a little ancient Roman history and a lot of Internet of Things.

Series of Microsoft Technical Case Studies articles

1. Power BI Embedded, IoT, and machine learning for processing brain thermograms.
2. How Canada searches for missing children.
3. The Internet of Things in archaeology.


This project concerns the study of an important archaeological site in Rome: the Mithraeum located under the stands of the Circus Maximus. The work was carried out together with Politecnico di Milano (PoliMi), a scientific and technical university that trains engineers, architects, and industrial designers. The City Department of Cultural Heritage (Sovrintendenza Capitolina) and the University of Trieste also took part in the project.

We collected readings from sensors installed inside the Mithraeum to remotely monitor carbon dioxide concentration, vibration levels, temperature, and humidity.

The Mithraeum has been preserved in good condition to this day. It was first discovered in the 1930s during restoration work on a building of the Rome Opera House. The building dates from the 2nd century AD and was probably rebuilt many times. In the 3rd century AD, its lower floor was set aside for a sanctuary: the Mithraeum.

The Mithraeum at the Circus Maximus (Rome)

The following IoT devices and technologies were used in the project:

  • Special sensors installed in the Mithraeum: an accelerometer (on the device), a thermometer (MCP9700A), a hygrometer (808H5V5), a carbon dioxide sensor (TGS4161), a soil thermometer (PT1000), and a photoresistor (VT43N2).
  • Libelium Waspmotes as a computing module for sensors.
  • Raspberry Pi 2 running Raspbian OS, with Mono and a C# console application, as the field gateway.
  • The IEEE 802.15.4 standard for low-rate wireless networks, connecting the sensors to the gateway.
  • Azure IoT Hub for receiving data: one message every 30 minutes.
  • Azure Stream Analytics to transfer data to the database and Power BI.
  • Azure SQL Database for data storage.
  • Power BI Embedded.
  • A public data-visualization web application hosted in Azure App Service.

Problem statement

The main objective of this project is remote monitoring of the current state of the Mithraeum using special sensors installed inside the monument, including accelerometers and carbon dioxide, temperature, and humidity sensors. Data is transmitted to the field gateway and then to the cloud infrastructure, where it is published in visual form on an ordinary web page.

Data visualization on the website

For PoliMi, monitoring the Mithraeum was the first step toward a reference system for collecting data from archaeological sites. In the future, that system will combine various types of sensors and gateways with the single cloud architecture built for this project, so the IoT technologies chosen here were selected with future developments in mind.

The main goals of this project are:

  • Find a scalable solution that can be applied at different archaeological sites.
  • Integrate the cloud infrastructure with the locally installed gateway and sensors, and solve the compatibility problems that arose along the way.
  • Find a solution that lets any user view the monitoring data from anywhere.

Sensors and Gateway

First of all, we held several Skype meetings to define the tasks more precisely and understand the current status of the project. By the time we started designing the cloud architecture, PoliMi staff had already selected the sensors and installed them in the Mithraeum.

Arrangement of the sensors in the Mithraeum

At regular intervals (a configurable period in milliseconds), the sensors send small data sets to the field gateway, where they are processed locally. As the table below shows, each message is quite small, so every 30 minutes the data is grouped into a single .txt file containing all the sensor readings. The C# console application processes the resulting file and sends the structured data to the Azure IoT Hub.
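
As a sketch of this local processing step, the 30-minute batch file can be turned into structured records before sending. The "key:value" line format and the field names below are assumptions for illustration, not the project's actual file format.

```python
# Hypothetical sketch of the gateway-side parsing step; the raw file's
# "key:value;key:value" line layout is an assumed format.
import json

def parse_line(line):
    """Parse one 'key:value;...' reading into a dict (assumed format)."""
    record = {}
    for part in line.strip().split(";"):
        if not part:
            continue
        key, _, value = part.partition(":")
        # Convert numeric-looking values to floats, keep the rest as strings.
        if value.lstrip("-").replace(".", "", 1).isdigit():
            record[key] = float(value)
        else:
            record[key] = value
    return record

def parse_file(text):
    """Turn a 30-minute batch file into a list of measurement dicts."""
    return [parse_line(l) for l in text.splitlines() if l.strip()]

raw = "id:1;temp:21.5;hum:63.0\nid:2;co2:412\n"
print(json.dumps(parse_file(raw)))
```

The resulting list serializes directly into the JSON array of measurements that the gateway application sends onward.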

Characteristics of the sensors in the Mithraeum

Based on this data, we were able to estimate the monthly cost of the entire cloud infrastructure before deploying it, using the Azure pricing calculator. This initiative, highly praised by the client, allowed us to continue work on the project. Other advantages played a role as well: easy deployment, scaling, and portability of the solution, plus support from highly qualified Microsoft employees.

Cost calculation

Project development challenges

One of the most serious tasks in this project was data parsing. We had to separate the readings of all the sensors collected in the text file in order to build a correct message for the IoT Hub. The PoliMi specialists helped us a great deal here. Once we had clearly understood and defined the message structure, we were able to visualize the data correctly.

Another major challenge was integrating the cloud infrastructure with sensors and a gateway.
We worked together with the client through various forms of communication: several Skype conferences, in-person meetings, and remote development sessions and discussions. In total, we spent about 10 full working days on all stages of the project, including design, development, and testing. During this time we organized the implementation of the system and created a comprehensive solution.

In-person meeting with Koustabh and Luca

Solution and its stages

As described above, four types of sensors send data to the field gateway at the frequency indicated in the solution diagram below.

The field gateway connects to the Azure IoT Hub and sends about 1 KB of data every 30 minutes. We chose MQTT as the transport protocol, since it is widespread and well suited to the devices in this project.
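
For reference, a device talks to Azure IoT Hub over MQTT using a fixed naming scheme described in the public IoT Hub MQTT documentation. The sketch below only builds those connection parameters; the hub and device names are made up, and an MQTT library such as paho-mqtt would use the result to connect over TLS and publish the 30-minute payload.

```python
# Sketch of the MQTT connection parameters for Azure IoT Hub, following the
# documented naming scheme; "mitreo-hub" and "field-gateway" are made-up names.
def mqtt_params(hub_name, device_id):
    host = f"{hub_name}.azure-devices.net"
    return {
        "host": host,
        "port": 8883,                       # MQTT over TLS
        "client_id": device_id,             # must equal the registered device ID
        "username": f"{host}/{device_id}/?api-version=2018-06-30",
        "topic": f"devices/{device_id}/messages/events/",  # device-to-cloud
    }

params = mqtt_params("mitreo-hub", "field-gateway")
print(params["topic"])
```

The password for the connection is a SAS token derived from the device's symmetric key (see the security section below); the gateway publishes its JSON payload to the `devices/{device_id}/messages/events/` topic.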

Azure Stream Analytics processes the data stream and sends the measurement results to the appropriate SQL database table.

The web application publishes Power BI reports so that end users (specialists from PoliMi) receive visualized data.

Project architecture diagram

Data for analysis

To collect data, 10 Libelium Waspmote nodes were used. The sensor types were chosen by the archaeologists and the staff responsible for restoring the site, and the corresponding Libelium sensor boards were mounted on the Waspmote computing devices.

The readings are transferred to a Raspberry Pi 2 device, which runs the Raspbian operating system and acts as the field gateway.

The choice of OS was determined by the client's preferences: they had extensive experience with Raspbian and had hardly used Windows 10 IoT Core or other operating systems. The device processes the data received from the sensors. For example, a fast Fourier transform (FFT) is applied to the values collected from nodes 3 and 4 (the accelerometers), and the result is added to the message sent to the cloud.
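
The accelerometer preprocessing can be sketched as follows. A naive DFT stands in for the gateway's FFT routine to keep the example dependency-free, the 20-coefficient cutoff mirrors the packet format described later, and the vibration signal is synthetic.

```python
# Sketch of the accelerometer preprocessing: keep the magnitudes of the first
# 20 spectral coefficients of a vibration window. A naive DFT is used here for
# brevity; the real gateway would use an optimized FFT implementation.
import cmath
import math

def fft_features(samples, n_bins=20):
    """Magnitudes of the first n_bins DFT coefficients of the sample window."""
    n = len(samples)
    return [abs(sum(samples[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                    for k in range(n)))
            for j in range(n_bins)]

# Synthetic vibration: a pure 5 Hz tone sampled at 128 Hz for one second.
samples = [math.sin(2 * math.pi * 5 * k / 128) for k in range(128)]
features = fft_features(samples)
print(max(range(len(features)), key=lambda j: features[j]))  # dominant bin: 5
```

The 20 resulting numbers are exactly the kind of compact spectral summary that fits into the small packets the nodes send to the gateway.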

A Python application running on the gateway generates a text file with all the collected data every 30 minutes. Python turned out to be the best choice for Koustabh Dolui to minimize effort and to simplify replicating the solution on other platforms in the future. This file is used to send data to the IoT Hub. The image below shows a sample .txt file with raw data that is parsed by the C# console application on the field gateway.

Raw Text File

Preliminary processing

To send the necessary data, the text file must be converted by a console application running on the field gateway. At this stage, the question arose of which language to develop the console application in. The IoT Hub SDK is available for both Python and C#. Having discussed both options with the client, we decided to develop the application in C#, since we knew that language better, and installed Mono on the Raspberry Pi to run it.

This application creates a JSON array of measurements ready for sending to the IoT Hub, containing the readings of all sensors extracted from the text file. During final testing, we used a very large .txt file and ran into problems with the size of messages sent to the IoT Hub.

If the text file is larger than 256 KB, it has to be split into smaller pieces, because the IoT Hub limits the maximum size of a device-to-cloud message to 256 KB.
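
A minimal sketch of that splitting step, assuming the measurements are simple JSON records: pack them greedily into JSON arrays that each stay under the limit.

```python
# Sketch of splitting a large batch of readings into JSON-array payloads that
# each fit under the IoT Hub's 256 KB device-to-cloud limit.
# The record layout is illustrative.
import json

LIMIT = 256 * 1024  # bytes

def split_into_messages(records, limit=LIMIT):
    """Group records into JSON-array payloads whose UTF-8 size stays <= limit."""
    messages, batch = [], []
    for rec in records:
        candidate = batch + [rec]
        if len(json.dumps(candidate).encode("utf-8")) > limit and batch:
            messages.append(json.dumps(batch))  # flush the full batch
            batch = [rec]
        else:
            batch = candidate
    if batch:
        messages.append(json.dumps(batch))
    return messages

records = [{"id": 1, "ts": 1490000000 + i, "value": 20.0 + i} for i in range(1000)]
msgs = split_into_messages(records, limit=4096)  # small limit for demonstration
print(all(len(m.encode("utf-8")) <= 4096 for m in msgs))  # True
```

Each payload is then sent as its own device-to-cloud message, so no single message exceeds the hub's limit.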

Scheme for parsing text data

Creating a parser for the readings of nodes 3 and 4 was complicated by the nature of the data sent. These nodes transmit the following information to the field gateway: a timestamp, the node ID, the battery level, coordinates, a packet number, and a set of 20 numbers that form part of the FFT. A transmission from one sensor at a specific position consists of seven packets with timestamps within 70 milliseconds of each other. We had to identify these packets and merge them into one table row containing all the FFT values of the seven packets (represented as a JSON array and stored together as a string).
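
The merging rule can be sketched like this; the field names are assumptions, but the 70 ms window and the JSON-string storage of the combined FFT values follow the description above.

```python
# Sketch of the packet-merging rule: packets from the same node whose
# timestamps fall within a 70 ms burst are merged into one row, and the
# combined FFT values are stored as a JSON string. Field names are assumed.
import json

WINDOW_MS = 70

def merge_packets(packets):
    """Group packets per node into 70 ms bursts, concatenating their FFT data."""
    rows = []
    for pkt in sorted(packets, key=lambda p: (p["node_id"], p["ts_ms"])):
        last = rows[-1] if rows else None
        if (last and last["node_id"] == pkt["node_id"]
                and pkt["ts_ms"] - last["ts_ms"] <= WINDOW_MS):
            fft = json.loads(last["fft"])
            last["fft"] = json.dumps(fft + pkt["fft"])  # extend the burst's row
        else:
            rows.append({"node_id": pkt["node_id"], "ts_ms": pkt["ts_ms"],
                         "fft": json.dumps(pkt["fft"])})
    return rows

# Seven packets from node 3 arriving 10 ms apart, 20 FFT values each.
packets = [{"node_id": 3, "ts_ms": 1000 + 10 * i, "fft": [float(i)] * 20}
           for i in range(7)]
rows = merge_packets(packets)
print(len(rows), len(json.loads(rows[0]["fft"])))  # one row, 7 * 20 values
```

Each resulting row maps directly onto a record in the accelerometer table, with the concatenated FFT array stored as a single string column.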

Telemetry transmission

As mentioned earlier, due to the nature of the connection we first had to process a text file containing the sensor readings. Real-time monitoring was not a goal of this project, so the final system operates with a slight delay. In the end, we decided not to abandon the text file created by the field gateway, since it can serve the client as a log file, and we continued to use the parser we had developed.

The Raspberry Pi works as an opaque field gateway, so we registered the C# console application as a single device in the IoT Hub. This means that no matter how many sensors we connect to it, the gateway performs all the necessary processing, aggregates the data, and creates one message to send to the IoT Hub. This approach also allows the architecture to be reused at other archaeological sites, making the solution scalable and easily portable.


Azure IoT Hub allows each device to be registered with a name and a symmetric key. This means that each device receives its own connection string.

If a device is compromised, it can be disabled, and data exchange with it stopped, from the control panel in the Azure portal.
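
The credential behind each connection string can be illustrated with Azure's documented SAS token scheme: the device's base64-encoded symmetric key signs the resource URI plus an expiry time with HMAC-SHA256. The hub, device, and key below are made-up values.

```python
# Sketch of Azure IoT Hub's documented SAS token derivation for one device;
# "myhub", "field-gateway", and the key are fabricated example values.
import base64
import hashlib
import hmac
import time
from urllib.parse import quote_plus

def generate_sas_token(resource_uri, device_key, ttl_seconds=3600, now=None):
    """Build 'SharedAccessSignature sr=...&sig=...&se=...' for one device."""
    expiry = int((now if now is not None else time.time()) + ttl_seconds)
    # Sign "<url-encoded-uri>\n<expiry>" with the base64-decoded device key.
    sign_target = f"{quote_plus(resource_uri)}\n{expiry}".encode()
    key = base64.b64decode(device_key)
    signature = base64.b64encode(
        hmac.new(key, sign_target, hashlib.sha256).digest())
    return (f"SharedAccessSignature sr={quote_plus(resource_uri)}"
            f"&sig={quote_plus(signature.decode())}&se={expiry}")

token = generate_sas_token("myhub.azure-devices.net/devices/field-gateway",
                           base64.b64encode(b"fake-device-key").decode(),
                           now=1490000000)
print(token.startswith("SharedAccessSignature sr="))  # True
```

Because the token expires (`se`) and is scoped to one device (`sr`), revoking or rotating a single device's key does not affect the rest of the fleet.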

IoT Hub provides an effective turnkey solution for protecting transmitted data. No security requirements were specified before work on the project began, but the client greatly appreciated this capability.

Telemetry Storage with Stream Analytics

Data transferred to the Azure IoT Hub must be stored and visualized. After discussion with the client, we decided to create a database with four tables, one per sensor type. We chose Azure SQL Database because Power BI Embedded can query it directly.

Azure Stream Analytics is then configured with one input stream and four output streams, as shown in the screenshot below.

Azure Stream Analytics configuration

The query consists of four statements, each sending telemetry data to the corresponding table in the SQL database. For instance:

	SELECT
	    dateadd(s, ts, '1970-01-01') as ts,
	    ...
	INTO accsample
	FROM [input]
	WHERE id = 1 OR id = 2

As you can see, the dateadd function is used to convert the time from UNIX format into a readable form. The rest of the statement reads data from the stream and stores it in the mapped output table accsample.

Data visualization

The final stage of the project is visualization. To simplify data analysis, we decided to embed a Power BI report in an ASP.NET web application. This let us achieve two important goals of the project: displaying the data on charts that the client can easily personalize and, most importantly, opening public access to the sensor readings.

The charts, graphs, and data filters available in the web application were created in close collaboration with the client so that the result would meet expectations: separate tabs for different sensors, filters, and so on.

Another example of specific requirements: filtering data by timestamp for each device.

Data visualization using PowerBI Embedded


We developed a comprehensive Internet of Things solution for obtaining data from an important archaeological site and tracking its condition remotely. This is of great importance to the researchers, who now have access to the necessary information from anywhere.

In addition, the implemented architecture is easy to scale and migrate. The specialists at Politecnico di Milano wanted to understand how to use these technologies and adapt the solution to other projects, and our joint work achieved these goals.

Data visualization is very simple and straightforward. Researchers can use it to obtain information about the state of an archaeological site.


A comprehensive solution for remote monitoring of the Mithraeum at the Circus Maximus has been successfully implemented. The project team acquired all the skills needed to create it. Over the next few months, the researchers will test and evaluate the system's effectiveness in order to understand how to improve it for the specific needs they face.

Moreover, the PoliMi team gained experience with the Azure platform and can easily recreate a similar architecture for other remote monitoring projects at other archaeological sites.


Additional Resources

The GitHub repository with the C# console application running on the field gateway
Information about the Mithraeum at the Circus Maximus on the Sovrintendenza Capitolina website

Business.IoT: Discovering the Internet of Things

On March 30, 2017, Microsoft will host the online conference "Business.IoT: Discovering the Internet of Things". The program includes two parallel tracks (business and technology) with best practices and recommendations from leading experts in the Internet of Things, machine learning, and predictive analytics.

Speakers will include Mikhail Chernomordikov (Microsoft), Sergey Osipov (MAYKOR-GMCS), Dmitry Bergelson (GuaranaCam), Anna Kulashova (Microsoft), Dmitry Marchenko (Microsoft), Andrey Meluzov (KORUS Consulting), and Vasily Yesipov (KPMG).

To participate, you must register here .
