1. Brief Introduction to Bosch IoT Insights

Field data is an essential asset for understanding the real-world conditions that products are exposed to. This understanding is a prerequisite for improving products and keeping them competitive on the market. However, field data for products is not available automatically; it takes effort to build up a sustainable process and environment for collecting and using it. Bosch IoT Insights therefore provides a cloud infrastructure, REST APIs and a web application user interface for storing, querying and isolating data for further investigation. It is based on state-of-the-art technologies such as MongoDB.


2. Getting Started

2.1. Service Booking

If you want to try out the service with the Free service plan, please use the Bosch IoT Suite portal to place your subscription:

  1. Starting point is https://bosch-iot-suite.com/

  2. Click My account and Sign in with your Bosch ID. If you have no Bosch ID yet, feel free to register a new user account. Once the authentication is successful, you will be re-directed to the Bosch IoT Suite portal.

  3. Click New Subscription and select Bosch IoT Insights:

    1. Select the service plan: Currently, the Free plan is the only option offered via this channel.

    2. Set the name of your instance: You can book multiple instances of each type of service, so make sure to set a descriptive name.

    3. Click Subscribe

    4. The status will be “Provisioning” while the service subscription mechanism checks the validity, e.g. that the instance name is unique.

    5. If the status is not yet “Active”, click the Refresh button after a few seconds.

  4. Check your Credentials

  5. To Edit the service configuration or visit the Dashboard, click the respective action button.

2.2. Bosch IoT Insights

2.2.1. Welcome to your IoT data management project with Bosch IoT Insights

Discover the functions of Bosch IoT Insights quickly and easily with our helpful tutorials. They will provide you with valuable help and useful tips, enabling you to gain practical experience with the system.

Use our sample data and a simple use case to get to know Bosch IoT Insights.

We offer you two tutorials, differing in terms of length and complexity:

  1. Quick intro (about 5 min)
    The quick intro is restricted to the essentials. Start with the quick intro if you already have some basic understanding of the topic.

  2. Detailed introduction (about 30 min)
    The detailed introduction comprises several individual steps and includes extensive descriptions. It provides you with deeper insights into the technical/functional processes of Bosch IoT Insights, enabling you to gain a broader understanding of the software and how to work with it in the best possible way.

What to do first

  1. Register your Bosch ID

  2. Register an instance of Bosch IoT Insights Services (Service Booking)

  3. Download the demo data: watermeter_201807.zip
    The JSON file contains the demo data for the water consumption of different housing units.

Are you ready? Then let’s get started!

2.2.2. Quick Intro

In this tutorial you will upload initial data to Bosch IoT Insights Service, evaluate the data sets, and generate a graphic presentation of the results of the analysis.


Step 1 - Data Upload
  1. To start uploading data, click on the Services tab.

  2. Select Data upload.

  3. Upload the data set provided. (Download: watermeter_201807.zip)

[Figure: Manual Data Upload]
Step 2 - Create a template

Now you can begin aggregating and selecting the data according to your requirements.

To do this, you need to create your first template:

  1. Select the Explore tab, then select Data Explorer.

  2. Click Add Template.

  3. Give your template a Name and a Description.

  4. Make sure “Share with other project members” is set.

  5. Copy the prefabricated template text shown below into the query input field.

  6. Save your template by clicking on Create new template.

Template Text: Water consumption per flat and day

[{
        "$unwind": "$payload.measurements"
    },
    {
        "$match": {
            "payload.measurements.timestamp": {
                "$gte": {
                    "$date": "2018-06-30T22:00:00.000Z"
                },
                "$lte": {
                    "$date": "2018-07-30T22:00:00.000Z"
                }
            }
        }
    },
    {
        "$project": {
            "_id": 0,
            "meter": "$payload.measurements.meter",
            "value": "$payload.measurements.value",
            "timestamp": "$payload.measurements.timestamp"
        }
    },
    {
        "$match": {
            "meter": "Flat 1"
        }
    },
    {
        "$group": {
            "_id": {
                "meter": "$meter",
                "day": {
                    "$dateToString": {
                        "format": "%Y-%m-%d",
                        "date": "$timestamp"
                    }
                }
            },
            "value": {
                "$sum": "$value"
            },
            "count": {
                "$sum": 1
            }
        }
    },
    {
        "$project": {
            "_id": 0,
            "Flat No": "$_id.meter",
            "Day": "$_id.day",
            "Liter": {
                "$trunc": "$value"
            },
            "Measurements": "$count"
        }
    },
    {
        "$sort": {
            "Flat No": 1,
            "Day": 1
        }
    }
]
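For orientation, the pipeline above implies the structure of the demo documents: each document carries a payload.measurements array whose elements provide meter, value and timestamp fields. A single uploaded document therefore presumably looks roughly like this (an illustrative sketch only; the values are made up):

{
    "payload": {
        "measurements": [
            {
                "meter": "Flat 1",
                "value": 12.5,
                "timestamp": { "$date": "2018-07-01T06:00:00.000Z" }
            },
            {
                "meter": "Flat 2",
                "value": 8.3,
                "timestamp": { "$date": "2018-07-01T06:00:00.000Z" }
            }
        ]
    }
}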
[Figure: Template Definition]
Step 3 - Visualization

Navigate to the overview of your dashboard.

  1. Select Add widget to insert your first graphic.

  2. Select the last widget, Query result.

  3. Add a Name and select the template from Step 2.

[Figure: Query result]
  1. Click on the Output tab to determine further graphics settings.

  2. Select Display mode – Chart.

[Figure: Query result display mode]
  1. Go to Chart Config.

  2. Use the Add column function to add all three parameters (Flat No, Day, Liter) to your graphic.

  3. Select your preferred presentation form from the eight different variants available (recommended: Vertical Stacked Bar Chart) and click on Add to add it to your dashboard.

[Figure: Query result as Vertical Stacked Bar Chart]
Step 4 - Dashboard configuration

Navigate to the overview of your dashboard.

  • Use Start ordering to position (drag & drop) your graphics anywhere on your dashboard.

  • Use Activate editing to make any additional changes to your graphics.

[Figure: dashboard overview]

2.2.3. Detailed Introduction


Step 1 - Data Upload

Your account does not yet contain any data that we could use for analysis. That’s why we have provided you with test data (watermeter_201807.zip), which you have already downloaded.

  1. In order to upload this data to Bosch IoT Insights, go to the menu item Services.

  2. Select Data upload and then Add files.

  3. After selecting the ZIP file, click on Upload.

[Figure: Manual Data Upload]

During the upload process, all the raw data from your devices is stored in the Input Data database and, simultaneously, the system begins processing that data. That means the raw data is checked by a generic processor. The processor unzips the ZIP file and stores the data contained in it (in our case, the watermeter_201807.json file).

To add additional tasks to the Data Processing Pipeline, e.g.:

  • Decoding the data set using our various decoder formats (ODX, FIBEX, A2L, DBC)

  • Checking every data set for any errors, omissions and inconsistencies.

  • Cleaning up the data sets to ensure they are valid and suitable for further processing.

  • Transforming the data set so that it has a uniform structure and is subsequently saved.

you need to implement a custom processor, which is not yet possible in the Free plan.

[Figure: Bosch IoT Insights data management process]
Step 2 - Input history

After the data has been saved, it can be analyzed in more detail. In the future, your various devices will send a constant stream of data to Bosch IoT Insights. This data is also referred to as input data.

  1. To view the input data, click on the tab Explore – Input History. It provides you with a summary – like a catalog of data received – that contains, in its original format, all the data uploaded or transferred. It also shows you when the system received the data. You can see what device the data came from, the size of the files, and the current status of individual data sets.

[Figure: Input History]
  1. Click on the Data Browser tab. As already mentioned, all input data is processed during upload. The data browser shows you the entire content of all processed data.

  2. By clicking on Expand all, you can view all of the information in a data set in a tree structure. You can choose between different forms of presentation: tree, text or table.

[Figure: Data Browser]
Step 3 - Create a template

You have uploaded your data to Bosch IoT Insights and were able to take a closer look at it in Step 2. The following provides more details of how Bosch IoT Insights evaluates your data.

  1. Click on the menu item Explore – Data Explorer.

The data explorer works like the Google search engine. You use the Google search bar to find what you’re looking for. The same applies to the data explorer: it enables you to find out certain things about your data. The search is done by means of templates. Templates consist of individual operators, which are small snippets of code. You can read, search and select your data using a “Search request.” The aggregated “Search results” are displayed on your dashboard in the form of graphics.

Here’s an example: You want to know the average water consumption of a particular unit. The avg operator (see MongoDB) allows you to aggregate all water consumption data in the system and use it to calculate an average figure. This figure is then output in graph form on the dashboard.
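As a minimal sketch, such an average could be computed with a pipeline like the following (assuming the payload.measurements structure of the demo data and the unit name "Flat 1"; see the MongoDB documentation for the operator reference):

[
    {
        "$unwind": "$payload.measurements"
    },
    {
        "$match": {
            "payload.measurements.meter": "Flat 1"
        }
    },
    {
        "$group": {
            "_id": "$payload.measurements.meter",
            "averageConsumption": {
                "$avg": "$payload.measurements.value"
            }
        }
    }
]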

  1. Now create your first template. Click on Add template. Your template needs a name and a short description. Example: name: Water consumption as per date range Unit1; description: Get the water consumption for Unit 1 for a specific date range. The actual template text is written using the MongoDB aggregation framework and the FreeMarker template language (see the MongoDB Aggregation Reference and the FreeMarker Language Reference). MongoDB is a database that can handle extremely large volumes of complex data. Bosch IoT Insights thus offers you an optimal basis for your future IoT projects.

  2. The template text for our example is shown below. Copy this text and insert it in the appropriate field:

Template text: Water Consumption per Daterange Flat1

[{
        "$unwind": "$payload.measurements"
    },
    {
        "$match": {
            "payload.measurements.timestamp": {
                "$gte": {
                    "$date": "2018-06-30T22:00:00.000Z"
                },
                "$lte": {
                    "$date": "2018-07-30T22:00:00.000Z"
                }
            }
        }
    },
    {
        "$project": {
            "_id": 0,
            "meter": "$payload.measurements.meter",
            "value": "$payload.measurements.value",
            "timestamp": "$payload.measurements.timestamp"
        }
    },
    {
        "$match": {
            "meter": "Flat 1"
        }
    },
    {
        "$group": {
            "_id": {
                "meter": "$meter",
                "day": {
                    "$dateToString": {
                        "format": "%Y-%m-%d",
                        "date": "$timestamp"
                    }
                }
            },
            "value": {
                "$sum": "$value"
            },
            "count": {
                "$sum": 1
            }
        }
    },
    {
        "$project": {
            "_id": 0,
            "Flat No": "$_id.meter",
            "Day": "$_id.day",
            "Liter": {
                "$trunc": "$value"
            },
            "Measurements": "$count"
        }
    },
    {
        "$sort": {
            "Flat No": 1,
            "Day": 1
        }
    }
]
  1. Do you want to allow other employees to view and access this template? If so, check the box Other project members may see this template.

  2. Once you have finished creating your template text and finalizing your settings, click on Create new template to save it.

[Figure: Template Definition]
Step 4 - Visualization

The next step is to visualize your data. To do this, return to the overview.

The overview also doubles as your dashboard – where you can monitor all your data.

What do you see on your dashboard at the moment?

  • On the left-hand side is the item Project Stats. It shows:

    • the file size of your current project (Project size),

    • the number of individual data sets the project currently contains (Input count), and

    • how many of these have already been processed (Processed count).

  • A detailed summary of the file size is shown in the middle. You can see the volume of all the data transferred to your system (Input Data). As you already know, the system processes your input data during upload.

  • The bottom bar shows you the size of the file containing that processed data (Processed Data).

The dashboard consists of a number of configurable widgets that enable you to display, in graphic form, the data you select. The widgets show only the data you have selected using your template.

We are constantly developing and expanding the widget functions in Bosch IoT Insights so as to be able to offer you the very latest data visualization options at all times.

  1. Click on Add widget to create your first widget. You can choose between four different kinds of widget.

  2. Use the Select button to choose Query result. This widget allows you to transform prefabricated template texts into graphics. Set what size the widget will have on your dashboard (recommended: Full) and give it a name. Example: Water consumption as per date_range Unit1.

  3. Under Data source you can select the data set you want to display. Choose the template you created in Step 3.

  4. Now click on the Output tab to customize presentation of the data.

    1. You can choose between a tree diagram, table or chart. In our example, a chart would be the best mode of presentation. Select that option.

[Figure: Query result display mode]

  1. After that, a third tab – Chart Config – appears. Click on that.

  2. Under Column Path, you can determine which parameters are displayed in the graphic. Select all three of the parameters available (Flat No, Day, Liter) using the Add column button.

  3. Once you have done that, a menu will appear under Select Chart. Here you can choose from among eight different presentation variants. Select Vertical Stacked Bar Chart.

  4. Add additional information to complete your graph. Click on true (unhide) or false (hide) to add, for example, a legend or labels for individual axes or groups of data. In our example, we select all available types of additional information.

  1. Complete this process by saving your chart on the dashboard using the Add button.

[Figure: Query result as Vertical Stacked Bar Chart]
Step 5 - Dashboard configuration

Configure the dashboard to suit your requirements.

  1. Use the Activate editing and Start ordering buttons at the edge of the screen to customize your dashboard.

    1. Editing allows you to access the basic settings of individual widgets in order to change their content or visual presentation.

    2. The drag-and-drop function under Ordering allows you to place the widgets wherever you like on the dashboard.

  2. If you are happy with your changes, you can save them using the Disable editing or Finish ordering buttons.

[Figure: dashboard overview]

3. Account Management

3.1. Registration Process

To register for Bosch IoT Insights, users have to ask their Insights project manager to send them an invitation email. Once received, the user can follow the link to start the registration process. On the page that opens, the First Name, the Last Name, a unique Username and a secure Password of the user are required. After confirming by pressing the Activate Account button, a User has been activated message appears. The user will also find a link to log in.

Please note that invitations are only valid for a certain time period. The expiration date is shown in the invitation email.

3.2. Password Reset

When a password has been forgotten, it can be changed via the Forgot Password link on the login page. The user must then enter the affected Username and confirm the action by clicking Reset Password. The user will receive an email with a link where a new password (which needs to differ from the last ten used passwords) can be defined.

Please note that passwords must be secure. Therefore, the chosen password (at least 8 characters long) must contain upper and lower case letters, a number and a special character.

3.3. User Management

The User Management, found in the Admin section, is accessible to every user who has the manager or admin role for a given project. It offers the possibility to invite or remove users, list them, or change their roles.

Invite a new user:
Enter the email address and press Send invitation. After the invitation is sent, the new user appears with the status INACTIVE in the user list below. The role can now be set in the Main role column. After the user completes the registration process, the status changes to ACTIVE.

Remove a user:
Press the X button on the right side of the user and confirm the request.

Change a user role:
Select the new role in the Main role column. No confirmation is needed.

Changes in the User Management will only take effect after a new login.

By clicking the "E-Mail all visible" button, an email can be sent to all email addresses visible on the page. To write to more users at once, set the users-per-page limit to a higher number in the list settings.

The search function is useful for finding a specific user or a group of users (e.g. all users of the same company, by entering the domain in the search field).

Here is a list of roles, their description, their inherited roles (whose access rights are automatically applied) and sample features (access rights) which can be granted.

Admin

Technical administrator of a specific project. Is allowed to change settings which might break the project setup.

Feature examples (additional to the Manager role):

  • delete input and processed data

  • full access to all project features

Inherited roles: Manager, Power User, User, Data Provider

Manager

Business administrator of a specific project. Is only allowed to change settings which cannot break a project setup.

Feature examples (additional to the Power User role):

  • view database statistics

  • invite users to a project

  • access user management

Inherited roles: Power User, User, Data Provider

Power User

Role for power users of a specific project with extended features.

Feature examples (additional to the User role):

  • use Data Analyzer

  • write queries within the Template Designer

  • access Decoder Service

Inherited roles: User

User

Role for users of a specific project. Allowed to access data of this project.

Minimum role to:

  • access project data

  • use Data Browser and Data Explorer

  • see custom project views

Inherited roles: none

Data Provider

Role for (mostly technical) users of a specific project. Allowed to provide/send input data.

Role to:

  • send input data to a project only

Inherited roles: none


4. User Interface

The Bosch IoT Insights website uses Cookies and similar technologies, such as HTML5 Storage, to enable authentication and store user preferences. This allows easier navigation and a high degree of user-friendliness for this website.

You have to enable Cookies in your browser for www.Bosch-IoT-Insights.com

4.2. Login

To log in to Insights, go to www.Bosch-IoT-Insights.com. You will be presented a login screen where you can enter your credentials. Enter your Username and Password (as specified during the account registration) and click "Sign In" to start using Bosch IoT Insights. If you want to change your password (for security reasons or if you don’t remember it), click "Forgot Password" to create a new one.

4.3. Landing Page

The landing page offers the user an overview of common Insights information and the features and possibilities of Bosch IoT Insights.

4.4. User Settings

From every page, the User Settings and the Language Preferences can be accessed in the upper right corner. The language can be set to either English or German, with the browser language used as default. Within the User Info, the roles of the logged-in user can be seen. Furthermore, the user can download additional artefacts, access the user guides and log off.

4.5. Dashboards

As a user with power user rights (see User Management) you can customize the "project home dashboard" of your project and other user dashboards (views), if available in your project.

Your project home and other "standard dashboards" are composed of widgets. Those widgets are separate areas (or tiles) which can display static or dynamic content such as text, statistics, navigational links or query results.

The tiles that make up your dashboard are visually responsive. That means they adjust to the size of the user’s screen, depending on how you configure them (see Displaying Options).

To get started with customizing your dashboard, you have three buttons at the bottom of each customizable dashboard:

[Figure: dashboard buttons]

These buttons give you the following options:

4.5.1. Adding Widgets to Your Dashboard

When you click this button, you get a menu with four options:

[Figure: widget types]
  • Rich Text
    Allows you to create an informative tile with static text and images

  • Project Stats
    Creates a dynamic widget which shows the current numbers and sizes of different data accumulated in your project

  • Link Panel
    Create a collection of links, either automatically generated or manually defined by you (or both)

  • Query Results
    Lets the result of a query be displayed as a table, tree or chart. As data source, you can define an aggregation query or select an existing query template.

4.5.2. Edit Widgets on Your Dashboard

To change or delete the widgets on your dashboard, you have to go into "edit mode" by clicking "Activate editing". Now, when hovering over a widget, you will see a dashed line around it and the edit icon in the upper right corner.

By clicking this icon, you will get to the settings of that widget, where you essentially have the same options as when creating the widgets: You can change the title, appearance and the content of the widget. You also have the option to delete the widget. (You cannot change the type of a widget, as technically they are different things! You have to delete a widget and create a new one to accomplish that.)

4.5.3. Arranging (Ordering) the Widgets on Your Dashboard

You can arrange your widgets very easily in the "ordering mode": Just click "Start ordering" and the widgets will "collapse" into low blocks, showing just their heading. This makes it easier for you to keep an overview of all your widgets.

To reorder your widgets, just drag them around and drop them where you want. If you are happy with your arrangement or want to see how it looks, click "Finish ordering".

[Figure: arrange widgets]

4.5.4. Configuring Your Widgets

When you create or edit a widget you have different options to change its presentation and behaviour.

Displaying Options

The first two options are the same for every widget: You can choose the widget size and whether to show the widget or not. Under "Widget size" you choose how much of the available space (or width) your widget should take. You have the following options:

[Figure: widget size options]

When you place two "half" widgets after one another (or four quarters, or two quarters and one half, etc.), they will be displayed in one line. Of course, if you put two "quarters" and one "two thirds", the last widget will have to go to the next line.

The dashboards have a responsive layout, which means that they adjust to the size of the user’s display. So, on smaller screens the widgets will stack on top of each other to guarantee readability of the contents.

You can test how the widgets behave by playing around with the window size of your browser!

Under "Show widget" you can simply choose whether a widget should be displayed or not. So, if you currently don’t want to show a widget but don’t want to delete it either (e.g. because you wrote a nice aggregation query for it), you can just hide it and "keep it for later".

Titles

All widget types (except for rich text widgets) have a title. This is the heading which describes the content of the widget in large letters. By default, the title is in English. You can define titles for the other languages supported in Bosch IoT Insights by clicking on the "language dropdown" to the right.

If you didn’t define a title for the language selected by the user, the standard title (EN) will be used.

[Figure: widget titles]
Rich Text

The Rich Text widget lets you create static contents with text and images. It provides a basic set of the most useful text processing tools, like:

[Figure: rich text editor]
  • Text formatting: Make text bold or italic, underline it or strike it through

  • Mark text as quote or as code blocks

  • Define level 1 and level 2 headings (H1, H2)

  • Create ordered or unordered (bullet point) lists

  • Indent text (tabs)

  • Adjust text and background color

  • Align text (left, right, center or justify)

  • Clear all formatting

  • Add weblinks and pictures

Project Stats

Project Stats widgets are pretty straightforward: As with every widget, you can select the width and whether you want to show the widget or not, and you can define a title. The only other option is the display mode, i.e. what should be shown by the widget:

  • Data size + Input count + Processed count:
    Shows the size of used storage (MB, GB) in the project, and the numbers of stored input and processed documents

  • Collection type sizes chart
    Shows the sizes of processed and input data and other data used by the project in a bar chart

  • Collection type count chart
    Shows the number (document count) of processed and input data and other data used by the project in a bar chart

[Figure: Project Stats options]

In the end, this will look like this:

[Figure: Project Stats example]

Link Panel

The Link Panel lets you create automatically generated or manually defined links. You can manually define as many links as you want; just click the "Add Link" button at the bottom. For each link you can define:

  • Label (the "heading" of the Link)

  • Description (more detailed info about the link)

  • Link label (the display name of the link)

  • URL (the location or website to which the link leads)

For external links, prepend http:// or https://.

While none of the attributes are required, you should at least define a Link Label and a URL.

For Labels, Descriptions and Link Labels you can define versions for all available languages.

In addition to manual links you can let the Link Panel create links to

  • all available Explore Views (like Data Explorer, Data Browser or Input History) or

  • all available Project Views (custom project views)

If you do this, your manual links will be shown at the end of the link list.

Query Results

With the Query Result widget you can present the result of a query as a table, tree or chart. As a data source, you can select an existing query template or define a separate aggregation query.

Under "Data Source" you can choose "Query template" or "Aggregation query". If you select Query template, click on the "Query Template" field to select a predefined query template. (Query templates are defined under "Explore" → "Data Explorer".)

Query templates have to be shared with other project members to be used here!

In case you select "Aggregation query", you can choose the collection to query (usually "Processed Data") and define the query itself (a.k.a. "Aggregation query"). See the Data Browser section and especially the MongoDB Query Service section for more info about the Data Browser, MongoDB queries and the special syntax to use.

Under "Output" you can choose the visualization of the query result:

  • Table

  • Tree

  • Chart

4.6. Manual Data Upload

In some cases it might be necessary to insert data into your project manually. To do so, navigate to Data Upload, which can be found on the Services tab of the navigation bar.

First, you need to select your target project to which the data shall be uploaded. You can now select a single file or multiple files at once by clicking the Add Files button. The selected files will then be displayed, which gives you the possibility to remove or upload single files or all of the files and to add more files. The Upload or Upload All button will then trigger the actual file upload.

During and after the upload, the user is given visual feedback whether the upload and processing of the files was successful or not. You can remove the feedback boxes by clicking Close in the upper right corner of every box or by clicking Remove All at the bottom of the page.

Clicking Remove All will not only remove the visual feedback boxes, but it will also remove queued files (which are waiting for their upload) and files that are currently being uploaded.

On the one hand, the Upload Status illustrates the capability of the backend to receive and store the raw input data. On the other hand, the Processing Status represents the result of the further data processing after a successful upload. It may occur that the upload of the raw data succeeds whereas the subsequent processing fails. In this case, the user is given detailed feedback on the error. If the processing of the uploaded input files fails repeatedly, please validate that the selected files correspond to the project format. If the processing of valid data still fails, please contact the Insights support. It is good practice to attach the detailed error message so that the Insights support receives as much information as possible.

4.7. Common Data Exploration

4.7.1. Data Browser

The Data Browser offers the possibility to view and browse all data of a project as simply as possible. The data visualization result is displayed right after selecting a project. Furthermore, the data can be displayed as plain text or with the help of a collapsible JSON tree view. Since no custom aggregation queries can be expressed here, the Data Browser represents the easiest entry point for viewing the data of a project.

A detailed view of the REST request which sends the query in the background can be inspected by activating the Show Request Details checkbox. The HTTP request sent against the Insights backend (the MongoDB Query Service, in fact) will then be displayed, along with its HTTP request method, its header details and its request body.

Further on, the pagination buttons allow a single view of each dataset or an aggregated view of all datasets:

Single View of each Dataset:

{
    "_id": "123",
    "_class": "sample.class.1",
    "metaData": {
        "key": 1
    },
    "payload": {
        "key": "value"
    }
}

Aggregated View of all Datasets:

[
    {
        "_id": "123",
        "_class": "sample.class.1",
        "metaData": {
            "key": 1
        },
        "payload": {
            "key": "value"
        }
    },{
        "_id": "456",
        "_class": "sample.class.2",
        "metaData": {
            "key": 2
        },
        "payload": {
            "key": "value"
        }
    },{
        "_id": "789",
        "_class": "sample.class.3",
        "metaData": {
            "key": 3
        },
        "payload": {
            "key": "value"
        }
    }
]

4.7.2. Data Explorer

The Data Explorer helps to answer use-case-specific questions by offering predefined queries. The user only needs to provide a few input parameters like date ranges, IDs or limits. Under the hood, a complex query is executed which was prepared as a so-called query template by a domain expert. Furthermore, new query templates can be added using the Add Template button.

4.7.3. Data Analyzer

The Data Analyzer represents the expert mode for sending queries against the project-specific MongoDB collections of the backend. A query editor with text highlighting and autocompletion offers the possibility of exploring and tagging your own aggregation queries (please note that MongoDB queries for Insights must comply with the extended JSON format). Since the most recently developed query is stored in the local storage, it reappears when the user returns to the query editor. The default query, shown when a user accesses the page for the first time, is given below for reference:

[
    {
        "$match": {}
    },
    {
        "$limit": 10
    }
]

Besides showing the query results within the user interface (Run button), the results can be downloaded to the user’s local computer (Download button). The user can choose between the two export formats JSON and CSV. For converting already downloaded JSON data to CSV on the local computer, see this section for further information. Moreover, statistics of the project-specific collections are displayed next to the query editor.

Please look up the FAQs if code snippets cannot be pasted from the clipboard into the text field.

Further information concerning the MongoDB aggregation framework can be found in the official MongoDB documentation.

4.7.4. Template Designer

The Template Designer allows power users to prepare queries for the Data Explorer. New query templates can be added within the Data Explorer, using the Add Template button. A query template mainly consists of a MongoDB query and input parameters (please note that MongoDB queries for Insights must comply with the extended JSON format). The input parameters need to be provided by the user in the Data Explorer afterwards. When submitting the request, the given input parameters are processed together with the template to create and execute the resulting query. See this section for further technical details.

Please look up the FAQs if code snippets cannot be pasted from the clipboard into the text field.

Further information concerning the MongoDB aggregation framework can be found in the official MongoDB documentation.

4.7.5. Query History

The Query History lists all user-specific queries and therefore represents a history of sent queries along with their details. Queries can also be removed, or opened once again within the Data Analyzer by selecting a particular query. Filtering queries is possible via their given Tag Names.

4.8. Project Specific Pages

In addition to the predefined pages of this section, users (depending on their assigned projects) might be able to access additional project- or use-case-specific pages (such as project dashboards), which can also be found within the top navigation bar.


5. REST APIs

5.1. General Topics

5.1.1. Authentication Options

For requesting the RESTful services directly with a REST client, authentication must be ensured. The following authentication procedures are available:

Basic Authentication:
Within the header of an HTTP request, the Authorization field must be set. The user credentials must be encoded with Base64 and included like this: Authorization: Basic base64(User:Password). The following example shows an authorization header for the credentials foo:bar. Further information can be found here.

...
Authorization: Basic Zm9vOmJhcg==
...
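If you build the header programmatically, the Base64 step can be done with standard library means. A minimal Java sketch (only an illustration of the encoding, not part of the Insights APIs):

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BasicAuthHeader {
    public static void main(String[] args) {
        // Encode "User:Password" with Base64 for the Authorization header
        String encoded = Base64.getEncoder()
                .encodeToString("foo:bar".getBytes(StandardCharsets.UTF_8));
        System.out.println("Authorization: Basic " + encoded);
        // Prints: Authorization: Basic Zm9vOmJhcg==
    }
}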

API Key:
Another possibility is to use an API key. The API key needs to be included within the HTTP x-im-apikey header. The following example shows an HTTP header with an example UUID API key. Further information can be found here.

...
x-im-apikey: <UUID api key>
...

5.1.2. Working behind a Proxy-Server

Most companies use a proxy server for routing all outgoing internet traffic. If you work behind a proxy server, you have to configure your HTTP client to use it. Your internet browser should already be set up correctly by your IT department, so you should find the proxy settings there. This also means that you don’t have to configure proxy settings if you use a browser plugin as your HTTP client. Please also consider that some proxy servers additionally require authentication. For details, please contact your IT department.

5.1.3. Cookies

The Bosch IoT Insights services use Cookies to enable authenticated user sessions and to prevent security threats. Your HTTP client should have Cookies enabled in order to send them with every subsequent request!

One Cookie called XSRF-TOKEN needs special care. Its purpose is to protect against Cross-Site-Request-Forgery. The content value of this Cookie needs to be sent in a HTTP Header called X-XSRF-TOKEN on all subsequent requests!
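For illustration, a subsequent request honoring this rule would carry both the Cookie and the matching header (the token value is a placeholder):

...
Cookie: XSRF-TOKEN=<token value>
X-XSRF-TOKEN: <token value>
...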

5.1.4. HTTPS

The Bosch IoT Insights services can only be accessed over HTTPS. When accessing over HTTP, an automatic redirect to HTTPS is done. However, you should not rely on the HTTPS redirect when using the APIs because sensitive information could already be leaked at the first request.

Always use HTTPS URLs when accessing the Insights services.

5.1.5. API Versioning

Our service APIs are accessed by specifying a version in the URL, e.g. …/service-name/v1/action. We continuously improve and extend our APIs while trying to stay compatible with your client. Therefore, we increase the API version whenever we introduce changes which would break the previous version.

You are not forced to update your client immediately whenever we introduce new API versions. However, we encourage you to update your client as soon as possible, as we do not provide long-term support for old versions.

5.1.6. Swagger Documentation

We use Swagger to document our APIs. This includes a JSON-based API description and a modified version of the Swagger-UI to explore the API. The links to the Swagger-UI are provided in the corresponding section for each service.

The Swagger-UI always points to the latest service API version. However, there is a dropdown menu in the top navigation to switch the shown API version.

5.2. HTTP Data Recorder Service

5.2.1. Overview

The HTTP Data Recorder is a generic data ingestion service which allows a client to send data to the Insights backend. All raw data is kept stored in the system. The upload of new data triggers the processing of that data, which results in a new processed dataset stored separately from the input data. The target project and the content type of the data need to be given. The service generally accepts any content type (such as JSON, XML or ZIP), but it is the responsibility of the project-specific processor to handle the input correctly.

5.2.2. Example

The following HTTP POST request shows an example request for a project named 'demo', using basic authentication with the credentials foo:bar and sending JSON content.

HTTP POST https://bosch-iot-insights.com/data-recorder-service/v2/demo

Header:
Content-Type: application/json
Authorization: Basic Zm9vOmJhcg==

Body:
{
  "hello" : "world"
}
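For example, the same request could be sent with curl (using the demo project and the foo:bar credentials from above):

curl -X POST \
     -u foo:bar \
     -H "Content-Type: application/json" \
     -d '{ "hello" : "world" }' \
     https://bosch-iot-insights.com/data-recorder-service/v2/demo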

5.3. MongoDB Query Service

This service requires your HTTP client to enable Cookies to work properly! Please see our notes about Cookies.

5.3.1. Overview

The MongoDB Query Service allows executing MongoDB aggregation queries against the processed data of a project. Furthermore, some additional features are offered, like storing query templates or providing an overview of the query history. A project can have more than one collection for processed data, hence the target collection needs to be given as well when executing queries.

5.3.2. MongoDB Extended JSON

Since all MongoDB queries from within Insights (UI and REST API) are required to be written in extended JSON, this chapter provides an introduction to it and shows an example usage. This leads directly to the question of why an extended JSON format is necessary: Since raw JSON was not invented for storing documents with various data types (but to serialize data with less overhead than XML), MongoDB stores documents using BSON, the binary version of JSON, giving it the possibility to preserve data types. Under the hood, this makes sorting, comparing and indexing of documents much more efficient. Furthermore, only strict mode is supported by Bosch IoT Insights, meaning that any JSON parser can still parse strict mode representations of key/value pairs, but only the internal parser of MongoDB recognizes the actual type information.

Bosch IoT Insights MongoDB queries therefore need to comply with the extended JSON format. This requires transcribing usual MongoDB queries which other clients (like the MongoDB Shell, for instance) can handle. For most Insights queries, mainly two data types are relevant:

  • Date and its associated representation { "$date": "<date>" }

  • ObjectId and its associated representation { "$oid": "<id>" }

The following examples show how queries need to be adapted to comply with the extended JSON format. Doing so, exemplifications of the two essential data types (Date and ObjectId) are given below.

Conventional MongoDB query using the Date type:

{
    $match:{
        metaData.timestamp:{
            $lte:ISODate('2017-10-01T15:00:00.000Z')
        }
    }
}

Adapted MongoDB query using the Date type (compliant to extended JSON):

{
    "$match":{
        "metaData.timestamp":{
            "$lte":{
                "$date":"2017-10-01T15:00:00.000Z"
            }
        }
    }
}

Conventional MongoDB query using the ObjectId type:

{
    $match:{
        _id:ObjectId("5955f9618c9cdc0010e618f0")
    }
}

Adapted MongoDB query using the ObjectId type (compliant to extended JSON):

{
    "$match":{
        "_id":{
            "$oid":"5955f9618c9cdc0010e618f0"
        }
    }
}

The following Java code snippet shows how a query which uses BSON types can be converted to an extended JSON query. It utilizes MongoDB’s BSON library. The resulting String extendedOutputQuery could then be used from within the Bosch IoT Insights UI or when calling the REST APIs directly:

import org.bson.Document;
import org.bson.json.JsonMode;
import org.bson.json.JsonWriterSettings;

// Query in conventional shell syntax, using the BSON ObjectId type
String inputQuery = "{$match:{_id:ObjectId(\"5955f9618c9cdc0010e618f0\")}}";
// Parse into a BSON document and serialize it back in strict mode (extended JSON)
Document bsonDocument = Document.parse( inputQuery );
String extendedOutputQuery = bsonDocument.toJson( new JsonWriterSettings( JsonMode.STRICT ) );

// The String extendedOutputQuery results in:
// { "$match" : { "_id" : { "$oid" : "5955f9618c9cdc0010e618f0" } } }

5.3.3. Example for synchronous query execution

The example below shows an HTTP POST request for executing a MongoDB aggregation query against a demo collection. Please note that this is a synchronous call which returns the query result immediately. This might lead to HTTP timeouts if the query takes too long. The authentication header is shown for the example credentials foo:bar.

HTTP POST https://bosch-iot-insights.com/mongodb-query-service/v1/queries

Header:
Content-Type: application/json
Accept: application/json
Authorization: Basic Zm9vOmJhcg==

Body:
{
    "aggregate": [
        {
            "$limit" : 10
        }
    ],
    "collection" : "demo_processed_data"
}
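For example, the same synchronous call could be issued with curl (credentials and collection name as above):

curl -X POST \
     -u foo:bar \
     -H "Content-Type: application/json" \
     -H "Accept: application/json" \
     -d '{ "aggregate": [ { "$limit": 10 } ], "collection": "demo_processed_data" }' \
     https://bosch-iot-insights.com/mongodb-query-service/v1/queries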

5.3.4. Example for asynchronous query execution

The example below shows an HTTP POST request for executing a MongoDB aggregation query asynchronously. This is required for long-running queries (> 10 seconds) in order not to run into an HTTP timeout. The authentication header is shown for the example credentials foo:bar.

HTTP POST https://bosch-iot-insights.com/mongodb-query-service/v1/queries

Header:
Content-Type: application/json
Accept: application/json
Authorization: Basic Zm9vOmJhcg==

Body:
{
    "aggregate": [
        {
            "$limit" : 10
        }
    ],
    "collection" : "demo_processed_data",
    "async": true
}

The resulting document shows the query status, which is "PENDING" at first.

{
  "id": "59b78c8e32501c0015c28b5a",
  "requestType": "ASYNC",
  "collection": "demo_processed_data",
  "type": "AGGREGATE",
  "query": [
    {
      "$limit": 10
    }
  ],
  "createdAt": "2017-09-12T07:28:14Z",
  "status": "PENDING"
}

Use the following request to poll the query status. Replace the id in the URL with the one from the first response. Poll the status until it changes to "SUCCESSFUL".

HTTP GET https://bosch-iot-insights.com/mongodb-query-service/v1/queries/{id}

Header:
Accept: application/json
Authorization: Basic Zm9vOmJhcg==

As soon as the query has completed, you can fetch the results using the following request. Again, replace the id in the URL with the one of your query request.

HTTP GET https://bosch-iot-insights.com/mongodb-query-service/v1/queries/{id}/result

Header:
Accept: application/json
Authorization: Basic Zm9vOmJhcg==
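Putting the three requests together, a client-side polling loop might look like the following Java sketch (endpoints and credentials as above; the JSON status extraction is deliberately simplified, and real code would parse the response properly):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AsyncQueryPoller {

    private static final String BASE = "https://bosch-iot-insights.com/mongodb-query-service/v1/queries/";
    private static final String AUTH = "Basic Zm9vOmJhcg=="; // foo:bar

    public static void main(String[] args) throws Exception {
        String queryId = "59b78c8e32501c0015c28b5a"; // id from the first response
        HttpClient client = HttpClient.newHttpClient();

        // Poll the status resource until the query reports SUCCESSFUL
        String statusBody;
        do {
            Thread.sleep(2000); // wait a moment between polls
            HttpRequest statusRequest = HttpRequest.newBuilder()
                    .uri(URI.create(BASE + queryId))
                    .header("Accept", "application/json")
                    .header("Authorization", AUTH)
                    .GET()
                    .build();
            statusBody = client.send(statusRequest, HttpResponse.BodyHandlers.ofString()).body();
        } while (!statusBody.contains("\"SUCCESSFUL\""));

        // Fetch the result once the query has completed
        HttpRequest resultRequest = HttpRequest.newBuilder()
                .uri(URI.create(BASE + queryId + "/result"))
                .header("Accept", "application/json")
                .header("Authorization", AUTH)
                .GET()
                .build();
        System.out.println(client.send(resultRequest, HttpResponse.BodyHandlers.ofString()).body());
    }
}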

5.3.5. Query Templates

Overview

MongoDB queries can sometimes get quite complex, and not everybody is an expert in writing such queries. Expert users therefore prepare those queries in order to make them reusable for other users. However, some parts of a query might require dynamic user input; for example, the match criteria of a query usually involve date ranges, numerical limits, etc., which need to be provided for every single query execution. The solution to this problem is so-called query templates. A query template is a partially complete MongoDB query which can be processed by a template engine to generate a fully valid MongoDB query (please note that MongoDB queries for Bosch IoT Insights must comply with the extended JSON format). The query template syntax allows parameter placeholders, loops, boolean conditions or even simple arithmetic. The template engine then processes the query template together with the given user input to generate the complete MongoDB query.

Template Engine

We use the FreeMarker template engine for processing query templates, which offers a very rich template syntax. The following gives an overview of the most important features.

Parameters:
Parameters are needed to allow user-specific input for each query execution which is based on a template. A parameter has a name, a data type and a parameter type. The parameter name has to be unique within the template. The data type defines the type of the parameter value; currently, the following are supported:

  • String: a simple text value

  • Int: an integer value

  • Float: a floating point value

  • Boolean: a boolean value

  • Timestamp: an ISO 8601 string, for example 2017-07-08T12:00:00.000Z

The parameter type defines the way parameter values have to be given; currently, the following are supported:

  • Scalar: a single parameter value

  • Range: a from-to value range

  • List: a list of parameter values

  • Map: a key-value mapping, where the key is always a string and the value is of the given data type

Within the query template, a parameter placeholder can be defined with ${parameter}, where parameter would be the name of the parameter. The template engine then replaces ${parameter} with the given value of the parameter.

Example for Scalars:
The following query template contains a placeholder for a parameter called user. The query template would have to define a parameter called user with data type String and parameter type Scalar:

[
    {
        "$match": {
            "name": "${user}"
        }
    }
]

When processing the template, the placeholder for ${user} is simply replaced with the given input value for that parameter.
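For example, with the input value Philipp for the user parameter, the resulting query would be:

[
    {
        "$match": {
            "name": "Philipp"
        }
    }
]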

Example for Ranges:
The following query template contains a placeholder for a parameter called measurementDate. The query template would have to define a parameter called measurementDate with data type Timestamp and parameter type Range:

[
    {
        "$match": {
            "timestamp": {
                "$gte": {
                    "$date": "${measurementDate.from}"
                },
                "$lte": {
                    "$date": "${measurementDate.to}"
                }
            }
        }
    }
]

When processing the template with the value { "from": "2017-07-01T12:00:00.000Z", "to": "2017-07-10T12:00:00.000Z" }, the resulting query would look like this:

[
    {
        "$match": {
            "timestamp": {
                "$gte": {
                    "$date": "2017-07-01T12:00:00.000Z"
                },
                "$lte": {
                    "$date": "2017-07-10T12:00:00.000Z"
                }
            }
        }
    }
]

Conditional Query Fragments:
Fragments of a query template can be made conditional using the if-directive. This directive can check for parameter values or whether a value is present at all.

Example for conditions:
The following query template contains a placeholder for a parameter called amountOfDocuments to limit the number of query results. The query template would have to define an optional parameter called amountOfDocuments with data type Int and parameter type Scalar. Additionally, the template defines an if-directive which checks if the desired parameter is given at all and then decides to include the fragment or not:

[
    {
        "$match": {
            "name": "Philipp"
        }
    }
    <#if amountOfDocuments??>
    , {
        "$limit": ${amountOfDocuments}
    }
    </#if>
]

When processing the template with a value of 100, the resulting query would look like the following:

[
    {
        "$match": {
            "name": "Philipp"
        }
    },
    {
        "$limit": 100
    }
]

Otherwise, if the parameter value is not given at all (since it was defined as optional), the resulting query would look like this:

[
    {
        "$match": {
            "name": "Philipp"
        }
    }
]

Loops:
Loops can be used to repeat a query fragment several times using the list-directive. This directive can iterate over parameters of type List or Map.

Example for Lists:
The following query template contains a placeholder for a parameter called fieldNames. The query template would have to define a parameter called fieldNames with data type String and parameter type List. Additionally, the template defines a list-directive to loop over the values of the parameter. The sep-directive is used to include a separator character (a comma in this case) between the list elements:

[
    {
        "$project": {
            <#list fieldNames as fieldName>
            "${fieldName}": 1 <#sep>,</#sep>
            </#list>
        }
    }
]

When processing the template with values [ "timestamp", "position" ], the resulting query would look like this:

[
    {
        "$project": {
            "timestamp": 1,
            "position": 1
        }
    }
]

Example for Maps:
The following query template contains a placeholder for a parameter called fieldNames. The query template would have to define a parameter called fieldNames with data type String and parameter type Map. Additionally, the template defines a list-directive to loop over the key-value-pairs of the parameter. The sep-directive is used to include a separator character (a comma in this case) between the list elements:

[
    {
        "$project": {
            <#list fieldNames as key, value>
            "${key}": "${value}" <#sep>,</#sep>
            </#list>
        }
    }
]

When processing the template with value { "timestamp": "$metaData.timestamp", "position": "$metaData.position" }, the resulting query would look like shown below:

[
    {
        "$project": {
            "timestamp": "$metaData.timestamp",
            "position": "$metaData.position"
        }
    }
]

Example for Optional Lists or Maps:
Sometimes, parameters of type List or Map can also be optional. In this case it is not necessary to use an additional if-directive; instead, the list-directive is split into a list- and an items-directive. The list-directive then checks whether the parameter value is present at all, and the wrapped items-directive iterates over the elements. Consider the example for lists from above, which could also use an optional list parameter.

[
    {
        "$match": {
            "name": "Philipp"
        }
    }
    <#list fieldNames>
    , {
        "$project": {
            <#items as fieldName>
            "${fieldName}": 1 <#sep>,</#sep>
            </#items>
        }
    }
    </#list>
]
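When processing this template with the values [ "timestamp", "position" ], the resulting query would contain both stages; if the optional parameter is omitted, only the $match stage remains:

[
    {
        "$match": {
            "name": "Philipp"
        }
    },
    {
        "$project": {
            "timestamp": 1,
            "position": 1
        }
    }
]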

5.3.6. Download Query Results

Query results can not only be inspected within the user interface, they can also be downloaded to your local computer (for further usage within specific software or for manual inspection, for example). Technically, the download can be triggered in two ways: The preferred approach is to set the HTTP Accept header, telling the Bosch IoT Insights backend which MIME type of the response is expected. Another approach is to append the format parameter to the requested URL (both approaches are described below).

The user interface furthermore offers the possibility to select predefined CSV formats or to apply custom CSV formats easily, using a selection dialog. Within this dialog, special CSV formats (like the standardized RFC 4180 or Excel-ready formats) can be selected when a CSV download is requested. The user can also configure custom CSV export formats, specifying the output formatting according to one’s needs. The available parameters, as well as their default values and their influence on the output CSV file, are listed below.

Please note that the default format, if no other predefined or custom format is requested, follows the RFC 4180 standard (which is also explained below).

Approach 1: HTTP Accept Header
When requesting the MongoDB Query Service (also see here), the Accept header of the request specifies the expected MIME type of the response. When triggering a CSV download, a sample Accept header could therefore look like this:

...
Accept: text/csv
...

Furthermore, predefined formats are also supported, such as:

...
Accept: text/vnd.sfde.uniview+csv
...

Custom CSV formats can also be specified within the Accept header:

...
Accept: text/csv;columnSeparator=tab;decimalSeparator=comma;dateTimeFormat="dd.MM.yyyy HH:mm"
...

The header for requesting a JSON file download would look like this:

...
Accept: application/json
...

Approach 2: URL Format Parameter
If setting HTTP headers is not possible or appropriate, the requested URL can be extended with the format parameter. Note that the arguments need to be URL-encoded, and the date time format parameter needs to be enclosed in quotation marks. A sample URL could therefore look like this:

...
https://bosch-iot-insights.com/mongodb-query-service/v1/queries/{id}/result?format=text%2Fcsv
...

Or with appended parameters:

...
https://bosch-iot-insights.com/mongodb-query-service/v1/queries/{id}/result?format=text%2Fcsv%3BdateTimeFormat%3D%22dd.MM.yyyy%20HH%3Amm%22
...

Approach 3: CSV Export Dialog
As already stated above, the user also has the convenient option to specify the CSV download format via a dialog in the UI.

Available Download Formats

Bosch IoT Insights offers the possibility to set the following MIME types, whereas the standard text/csv MIME type can be extended with parameters (as explained above):

  • text/csv

  • text/vnd.sfde.excel.de+csv

  • text/vnd.sfde.uniview+csv

  • application/json

  • application/zip

CSV Formatting Parameters

To satisfy different needs and formats, Insights offers eight parameters with which the CSV output can be modified. The following list shows each parameter name, its default value (when only text/csv is set within the header/URL), the possible arguments and its effect on the generated CSV:

charset
  Default value: UTF-8
  Possible arguments: all available standard Java charsets, e.g. UTF-16, UTF-32, US-ASCII, etc.
  CSV result: file encoding

locale
  Default value: en
  Possible arguments: en, de
  CSV result: English or German date/time names

header
  Default value: present
  Possible arguments: present, absent
  CSV result: header as first row, or first dataset as first row

lineBreak
  Default value: CRLF
  Possible arguments: CRLF, LFCR, LF, CR (alternatively \r\n, \n\r, \n, \r)
  CSV result: line break between rows

columnSeparator
  Default value: comma
  Possible arguments: comma, semicolon, tab (alternatively ",", ";", "\t")
  CSV result: separator between columns

masking
  Default value: false
  Possible arguments: false, true
  CSV result: values written as-is (whatever.value.xxx) or enclosed in quotation marks ("whatever.value.xxx")

decimalSeparator
  Default value: dot
  Possible arguments: dot, comma (alternatively ".", ",")
  CSV result: decimal separator of numeric values

dateTimeFormat
  Default value: yyyy-MM-dd HH:mm:ss
  Possible arguments: all valid date time formats
  CSV result: formats date time fields

Predefined CSV Formats

When no parameters are added to the HTTP Accept header or to the format URL parameter, Insights checks if the MIME type and subtype matches a predefined format. The predefined CSV formats are specified below:

RFC 4180
  • MIME type: text/csv

  • charset: UTF-8

  • locale: en

  • header: present

  • lineBreak: CRLF

  • columnSeparator: comma

  • masking: false

  • decimalSeparator: dot

  • dateTimeFormat: yyyy-MM-dd HH:mm:ss

Excel DE
  • MIME type: text/vnd.sfde.excel.de+csv

  • charset: UTF-8

  • locale: de

  • header: present

  • lineBreak: CRLF

  • columnSeparator: semicolon

  • masking: false

  • decimalSeparator: comma

  • dateTimeFormat: dd.MM.yyyy HH:mm:ss

UniView
  • MIME type: text/vnd.sfde.uniview+csv

  • charset: UTF-8

  • locale: de

  • header: present

  • lineBreak: CRLF

  • columnSeparator: semicolon

  • masking: false

  • decimalSeparator: comma

  • dateTimeFormat: dd.MM.yyyy HH:mm:ss

5.4. Data Pipeline Management Service

5.4.1. Overview

The Data Pipeline Management Service gives access to input, processed, and processing-related data. It is also possible to clean data or to trigger a re-processing of input data. Please note that this service does not offer extended query features; most query options are based on IDs and timestamps only.

5.5. Project Management Service

5.5.1. Overview

The Project Management Service gives access to the project configuration and associated data. Please note that the UI covers most features of this service, so it is very rare that you will have to use it directly over HTTP. New projects can be set up, and existing project configurations can be configured and bootstrapped. Also, project-related data, like master data, RSA keys or decoder files, can be managed by this service. Furthermore, the service provides the project-related user management.

5.6. Data Decoder Service

5.6.1. Overview

Next to the user interface (navigate to Decoder at the Services tab within the navigation bar), Bosch IoT Insights also provides a REST API for the Data Decoder Service. Using this feature, Insights offers the possibility to upload specific ODX or FIBEX decoder files (project-specific) and to decode CAN trace lines (in HEX string format), using a given decoder file. It therefore supports users while developing and testing new decoder files and offers the possibility to decode HEX-based CAN trace files on the fly. Note that the test CAN trace files and their decoded output are not stored within the Bosch IoT Insights backend, since this is mainly a service for test purposes; however, Insights returns a complete log of every decoding request (or the valid output, of course) so that possible errors can be found.

Please also note that the service within the UI offers the same functionality as the REST interface. The limitation of the user interface, however, is that it only supports manual tasks. For automated tests or for decoding multiple CAN trace files, the exposed REST API is therefore the means of choice.

5.6.2. Default Specification

The following list states the default specifications of the Data Decoder Service (please note that the values are default settings which might have been modified and configured differently):

  • Supported formats: ODX and FIBEX

  • Expiry time of a decoder file (after which the file is deleted automatically by the system): 1 week

  • Maximum file size of decoder specifications: 15 MB

  • Maximum lines of CAN traces per decoding request: 10,000

Furthermore, the detailed Swagger documentation can be found here, dividing the REST interface into three main resources:

  • Decoder: Decode HEX-based CAN trace lines, using a given decoder specification file (ODX or FIBEX)

  • Decoder Requests: Access previous decoder requests (but not the decoded output, since it is not stored)

  • Decoder Specifications: Manage (view, upload or delete) decoder specification files
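As a rough illustration of the Decoder resource (in Python; the Java equivalent is shown in section 6.2.4, whose endpoint path and payload structure are mirrored here), the following sketch decodes a CAN trace line against an already uploaded decoder. The decoder ID, the decoder type, the HEX input, and the credentials are placeholders:

import base64
import json
from urllib.request import Request, urlopen

USERNAME = '<username>'
PASSWORD = '<password>'
PROJECT = 'demo'
DECODER_ID = '<your_decoder_ID>'
DECODER_TYPE = 'fibex'
HEX_INPUT = '<your_HEX_input>'

url = ('https://www.bosch-sfde.com/data-decoder-service/v1/' + PROJECT +
       '/decoders/' + DECODER_ID + '/' + DECODER_TYPE + '/test')

# one or more HEX-based CAN trace lines to decode against the given decoder
payload = json.dumps({'testDataWithPdu': [HEX_INPUT]}).encode('ascii')

request = Request(url, data=payload)
request.add_header('Content-Type', 'application/json')
credentials = base64.b64encode((USERNAME + ':' + PASSWORD).encode('ascii')).decode('ascii')
request.add_header('Authorization', 'Basic %s' % credentials)

with urlopen(request) as response:
    # the response contains the decoded output or a complete log of the decoding request
    print(response.read().decode('utf8'))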

5.6.3. Examples

Java-based examples of how to use the REST API can be found here, showing how to upload a new decoder file and how to test CAN trace lines against it.


6. REST API Code Examples

6.1. Matlab

You can download the following code examples for an (a)synchronous call to the MongoDB Query Service of the Bosch IoT Insights backend as a ZIP file and import them into your Matlab installation.

Make sure to set your proxy configuration in Matlab if you are behind a corporate proxy (i.e. within the Bosch network). To do so, go to HOME - Preferences - Web and enter your proxy settings. Within the German Bosch network, set the Proxy host to rb-proxy-de.bosch.com and the Proxy port to 8080. The Proxy username and Proxy password are the same as for your Windows login.

Please note that these examples only work with Matlab 2015b or later.

6.1.1. Sending Data

The following code snippet shows an example call to the HTTP Data Recorder Service of the Insights backend from within Matlab.

function [ ] = MongoDBDataRecorderServiceExample( )
	% Send data to the sFDE DataRecorderService REST-API

	%  ---------   IMPORTANT --------------
	% Don't forget to configure your Matlab with the Bosch proxy server
	% Go to Preferences, Web Proxy: rb-proxy-de.bosch.com on port 8080
	% Authentication is required with your ordinary Windows credentials

	%====CONFIGURATION SECTION ======================================================

	% sFDE Backend URL
	MongoConfig.server = 'https://www.bosch-sfde.com';
	% sFDE service
	MongoConfig.serviceUrl = '/data-recorder-service/v2/demo';

    %====MAKE YOUR CHANGES HERE======================================================

	% Project Username and Password
	MongoConfig.username='YOURUSERNAME';
	MongoConfig.userpassword='YOURUSERPASSWORD';

	% Data to send in JSON format
	MongoConfig.data='{"demo":"test"}';

	%================================================================================
	sFDEBaseUrl = strcat(MongoConfig.server,MongoConfig.serviceUrl);
    options = weboptions('MediaType', 'application/json', 'ContentType', 'json', 'Username', MongoConfig.username, 'Password', MongoConfig.userpassword);

	% Place the POST request by using webwrite
    dataStruct = webwrite( sFDEBaseUrl, MongoConfig.data, options);
    disp(dataStruct);

    % YOUR CODE HERE

end

6.1.2. Synchronous query execution

The following code snippet shows an example synchronous call to the MongoDB Query Service of the Insights backend from within Matlab.

function [ ] = MongoDBQueryServiceSynchronousExample( )
	% Run a synchronous MongoDB Query at the sFDE REST-API

	%  ---------   IMPORTANT --------------
	% Don't forget to configure your Matlab with the Bosch proxy server
	% Go to Preferences, Web Proxy: rb-proxy-de.bosch.com on port 8080
	% Authentication is required with your ordinary Windows credentials

	%====CONFIGURATION SECTION ======================================================

	% sFDE Backend URL
	MongoConfig.server = 'https://www.bosch-sfde.com';
	% sFDE service
	MongoConfig.serviceUrl = '/mongodb-query-service/v1/queries';

    %====MAKE YOUR CHANGES HERE======================================================

	% Project Username and Password
	MongoConfig.username='YOURUSERNAME';
	MongoConfig.userpassword='YOURUSERPASSWORD';

	% Aggregation query to execute as db.collection.aggregate(...)
	MongoConfig.query='[{"$limit":1}]';

	% Collection the query should run
	MongoConfig.collection='demo_processed_data';

	%================================================================================
	sFDEBaseUrl = strcat(MongoConfig.server,MongoConfig.serviceUrl);
    options = weboptions('MediaType', 'application/json', 'ContentType', 'json', 'Username', MongoConfig.username, 'Password', MongoConfig.userpassword);

	% Create the string PostParam, that contains the query itself
	PostParam =  strcat('{"aggregate":',MongoConfig.query,',"collection":"',MongoConfig.collection,'"}');

	% Place the POST request by using webwrite
    dataStruct = webwrite( sFDEBaseUrl, PostParam, options);
    disp(dataStruct);

    % YOUR CODE HERE

end

6.1.3. Asynchronous query execution

The following code snippet shows an example asynchronous call to the MongoDB Query Service of the Insights backend from within Matlab.

function [ ] = MongoDBQueryServiceAsyncExample( )
	% Run an asynchronous MongoDB Query at the sFDE REST-API

	%  ---------   IMPORTANT --------------
	% Don't forget to configure your Matlab with the Bosch proxy server
	% Go to Preferences, Web Proxy: rb-proxy-de.bosch.com on port 8080
	% Authentication is required with your ordinary Windows credentials

	%====CONFIGURATION SECTION ======================================================

	% sFDE Backend URL
	MongoConfig.server = 'https://www.bosch-sfde.com';
	% sFDE service
	MongoConfig.serviceUrl = '/mongodb-query-service/v1/queries';

	% Timeout to wait for the asynchronous response in seconds
	MongoConfig.MaxTimeOutTime=30;

    %====MAKE YOUR CHANGES HERE======================================================

	% Project Username and Password
	MongoConfig.username='YOURUSERNAME';
	MongoConfig.userpassword='YOURPASSWORD';

	% Aggregation query to execute as db.collection.aggregate(...)
	MongoConfig.query='[{"$limit":1}]';

	% Collection the query should run
	MongoConfig.collection='demo_processed_data';

	%================================================================================
	sFDEBaseUrl = strcat(MongoConfig.server,MongoConfig.serviceUrl);
    options = weboptions('MediaType', 'application/json', 'ContentType', 'json', 'Username', MongoConfig.username, 'Password', MongoConfig.userpassword);

	% Create the string PostParam, that contains the query itself
	PostParam =  strcat('{"aggregate":',MongoConfig.query,',"async":true,"collection":"',MongoConfig.collection,'"}');

	% Place the POST request by using webwrite
    dataStruct = webwrite( sFDEBaseUrl, PostParam, options);
    disp(dataStruct);

	tic
	% loop until MaxTimeOutTime is reached

	while toc<=MongoConfig.MaxTimeOutTime

		%Place GET request to check if the query result is available
		sFDEUrl = strcat(sFDEBaseUrl,'/',char(dataStruct.id));
        dataStruct = webread( sFDEUrl, options);
    	disp(dataStruct);

		%% Check for "SUCCESSFUL"
		if strcmp(dataStruct.status, 'SUCCESSFUL')

			%%Send the final GET request to get results
			sFDEUrl = strcat(sFDEBaseUrl,'/',char(dataStruct.id),'/result');
            dataStruct = webread( sFDEUrl, options);
			disp(dataStruct);
			break
		else
			% If the result is not available, wait 1 second
			pause(1);
		end
    end

    % YOUR CODE HERE

end

6.2. Java

You can download the following code examples as a ZIP file and import them into your development IDE. Make sure to set your username and password as well as the appropriate proxy and the desired Bosch IoT Insights project before running the applications.

These examples require some external Java libraries:

6.2.1. Sending Data

The following code example shows you how to send data to the HTTP Data Recorder Service of the Bosch IoT Insights backend using Java code.

package com.bosch.si.sfde;

import java.io.UnsupportedEncodingException;
import java.nio.charset.StandardCharsets;
import com.sun.jersey.api.client.Client;
import com.sun.jersey.api.client.ClientResponse;
import com.sun.jersey.api.client.WebResource;
import com.sun.jersey.core.util.Base64;

public class DataRecorderServiceExample {
   public static void main( String[] args ) throws UnsupportedEncodingException {

      final String proxyHost = "rb-proxy-de.bosch.com";
      final String proxyPort = "8080";

      String resourceUrl = "https://www.bosch-sfde.com/data-recorder-service/v1/";
      String sfdeProject = "demo";

      String username = "your_username";
      String password = "your_password";
      String authorizationCredentials = generateAuthorizationToken( username, password );

      String contentType = "text/plain";
      String payload = "Hello World";

      System.setProperty( "https.proxyHost", proxyHost );
      System.setProperty( "https.proxyPort", proxyPort );

      WebResource service = Client.create().resource( resourceUrl + sfdeProject );
      ClientResponse response = service.header( "Authorization", authorizationCredentials ).header( "Content-Type", contentType )
            .post( ClientResponse.class, payload );

      System.out.println( response );
   }

   private static String generateAuthorizationToken( String username, String password )
      throws UnsupportedEncodingException {

      return "Basic " + new String( Base64.encode( username + ":" + password ), StandardCharsets.UTF_8 );
   }
}

6.2.2. Synchronous query execution

This code example shows you how to execute a MongoDB aggregation query against a demo collection.

In this example, the aggregation query is stored in the separate file queryParametersSync.json, which is read by the main program in MongoDBQueryServiceSyncExample.java.

Please note that a synchronous call blocks until the query result is returned, which might lead to HTTP timeouts if the query takes too long.

queryParametersSync.json:
{
  "aggregate": [
    {"$limit":1}
  ],
  "async": false,
  "collection": "demo_processed_data"
}
MongoDBQueryServiceSyncExample.java:
package com.bosch.si.sfde;

import java.io.IOException;
import java.io.UnsupportedEncodingException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonElement;
import com.google.gson.JsonParser;
import com.sun.jersey.api.client.Client;
import com.sun.jersey.api.client.ClientResponse;
import com.sun.jersey.api.client.WebResource;
import com.sun.jersey.core.util.Base64;

public class MongoDBQueryServiceSyncExample {

   public static void main( String[] args ) throws IOException {
      final String proxyHost = "rb-proxy-de.bosch.com";
      final String proxyPort = "8080";

      String resourceUrl = "https://www.bosch-sfde.com:443/mongodb-query-service/v1/queries";

      String username = "your_username";
      String password = "your_password";
      String authorizationCredentials = generateAuthorizationToken( username, password );

      String contentType = "application/json";
      String payload = new String( Files.readAllBytes( Paths.get( "src/main/resources/queryParametersSync.json" ) ) );

      System.setProperty( "https.proxyHost", proxyHost );
      System.setProperty( "https.proxyPort", proxyPort );

      WebResource service = Client.create().resource( resourceUrl );
      ClientResponse response = service.header( "Authorization", authorizationCredentials )
         .header( "Content-Type", contentType )
         .post( ClientResponse.class, payload );

      System.out.println( response );

      if ( response.getStatus() == 200 ) {
         System.out.println( parseJson( response.getEntity( String.class ) ) );
      }
   }

   private static String generateAuthorizationToken( String username, String password )
      throws UnsupportedEncodingException {

      return "Basic " + new String( Base64.encode( username + ":" + password ), StandardCharsets.UTF_8 );
   }

   private static String parseJson( String plainString ) {
      Gson gson = new GsonBuilder().setPrettyPrinting().create();
      JsonElement json = new JsonParser().parse( plainString );
      return gson.toJson( json );
   }
}

6.2.3. Asynchronous query execution

The following example shows the entire sequence of an asynchronous MongoDB aggregation query execution. Please note that an asynchronous call does not return the query result immediately. You have to poll the status of the query until it changes to SUCCESSFUL. Then you can fetch the results.

In this example, the aggregation query is stored in the separate file queryParametersAsync.json, which is read by the main program in MongoDBQueryServiceAsyncExample.java.

queryParametersAsync.json:
{
  "aggregate": [
    {"$limit":10}
  ],
  "async": true,
  "collection": "demo_processed_data"
}
MongoDBQueryServiceAsyncExample.java:
package com.bosch.si.sfde;

import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.io.UnsupportedEncodingException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

import com.google.gson.Gson;
import com.google.gson.JsonObject;
import com.sun.jersey.api.client.Client;
import com.sun.jersey.api.client.ClientResponse;
import com.sun.jersey.api.client.WebResource;
import com.sun.jersey.core.util.Base64;

public class MongoDBQueryServiceAsyncExample {

   public static void main( String[] args ) throws IOException, InterruptedException {
      final String proxyHost = "rb-proxy-de.bosch.com";
      final String proxyPort = "8080";

      String resourceUrl = "https://www.bosch-sfde.com:443/mongodb-query-service/v1/queries";

      String username = "your_username";
      String password = "your_password";
      String authorizationCredentials = generateAuthorizationToken( username, password );

      String contentType = "application/json";
      String payload = new String( Files.readAllBytes( Paths.get( "src/main/resources/queryParametersAsync.json" ) ) );

      System.setProperty( "https.proxyHost", proxyHost );
      System.setProperty( "https.proxyPort", proxyPort );

      ClientResponse response = httpRequestPost( resourceUrl, authorizationCredentials, contentType, payload );

      System.out.println( response );

      if ( response.getStatus() == 200 ) {
         Boolean queryError = false;
         String responseContent = response.getEntity( String.class );

         JsonObject json = new Gson().fromJson( responseContent, JsonObject.class );
         String requestId = json.get( "id" ).getAsString();
         String queryStatus = json.get( "status" ).getAsString();

         System.out.println( "Status: " + queryStatus );

         while ( !queryStatus.equals( "SUCCESSFUL" ) ) {
            InputStream responseStream = httpRequestGet( resourceUrl + "/" + requestId, authorizationCredentials,
               contentType );
            Reader reader = new InputStreamReader( responseStream );
            JsonObject jsonRes2 = new Gson().fromJson( reader, JsonObject.class );
            queryStatus = jsonRes2.get( "status" ).getAsString();
            System.out.println( "Status: " + queryStatus );

            if ( queryStatus.equals( "FAILED" ) || queryStatus.equals( "INCORRECT" ) ) {
               queryError = true;
               break;
            }

            // wait a second before polling the status again
            Thread.sleep( 1000 );
         }

         if ( !queryError ) {
            InputStream responseStream = httpRequestGet( resourceUrl + "/" + requestId + "/result",
               authorizationCredentials,
               contentType );
            System.out.println( "Status: LOADING DATA..." );
            Path path = Paths.get( System.getProperty( "user.home" ) + "/queryResponse.txt" );
            Files.copy( responseStream, path, StandardCopyOption.REPLACE_EXISTING );
            System.out.println( "Status: DONE" );
            System.out.println( "Response data was saved at " + path );
         }
      }
   }

   private static ClientResponse httpRequestPost( String url, String auth, String cType, String payload ) {
      WebResource service = Client.create().resource( url );
      ClientResponse response = service.header( "Authorization", auth )
         .header( "Content-Type", cType )
         .post( ClientResponse.class, payload );
      return response;
   }

   private static InputStream httpRequestGet( String url, String auth, String cType ) {
      WebResource service = Client.create().resource( url );
      InputStream response = service.header( "Authorization", auth ).header( "Content-Type", cType )
         .get( InputStream.class );
      return response;
   }

   private static String generateAuthorizationToken( String username, String password )
      throws UnsupportedEncodingException {

      return "Basic " + new String( Base64.encode( username + ":" + password ), StandardCharsets.UTF_8 );
   }
}

6.2.4. Data Decoder Service

The following code shows how to upload a new decoder file for making it available within the Data Decoder Service.

package com.bosch.si.sfde;

import java.io.File;
import java.io.UnsupportedEncodingException;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

import javax.ws.rs.core.MediaType;

import org.glassfish.jersey.media.multipart.FormDataMultiPart;
import org.glassfish.jersey.media.multipart.file.FileDataBodyPart;
import org.glassfish.jersey.media.multipart.internal.MultiPartWriter;

import com.sun.jersey.api.client.Client;
import com.sun.jersey.api.client.ClientResponse;
import com.sun.jersey.api.client.WebResource;
import com.sun.jersey.api.client.config.DefaultClientConfig;

public class DecoderServiceUploadExample {

   public static void main( String[] args ) throws Exception {
      // set proxy settings
      final String PROXY_HOST = "rb-proxy-de.bosch.com";
      final String PROXY_PORT = "8080";
      setProxySettings( PROXY_HOST, PROXY_PORT );

      // set user credentials
      String username = "your_username";
      String password = "your_password";
      String authorizationCredentials = generateAuthorizationToken( username, password );

      // prepare form data upload
      String project = "demo";
      String type = "FIBEX";
      String name = "my-new-decoder-file";
      String comment = "new decoder file";
      File file = new File( "path_to_your_decoder_spec_file" );

      String resourceUrl = "https://www.bosch-sfde.com/data-decoder-service/v1/" + project + "/decoders";

      DefaultClientConfig defaultClientConfig = new DefaultClientConfig();
      defaultClientConfig.getClasses().add( MultiPartWriter.class );
      WebResource webResource = Client.create( defaultClientConfig ).resource( resourceUrl );

      FormDataMultiPart formDataMultiPart = (FormDataMultiPart) new FormDataMultiPart()
         .field( "type", type )
         .field( "project", project )
         .field( "name", name )
         .field( "comment", comment )
         .bodyPart( new FileDataBodyPart( "file", file ) );

      // send http post request
      ClientResponse clientResponse = webResource
         .accept( MediaType.WILDCARD_TYPE )
         .type( MediaType.MULTIPART_FORM_DATA_TYPE )
         .header( "Authorization", authorizationCredentials )
         .post( ClientResponse.class, formDataMultiPart );
      formDataMultiPart.close();

      System.out.println( clientResponse.getStatus() );
      System.out.println( clientResponse.getEntity( String.class ) );
   }

   private static void setProxySettings( String host, String port ) {
      System.setProperty( "http.proxyHost", host );
      System.setProperty( "http.proxyPort", port );
      System.setProperty( "https.proxyHost", host );
      System.setProperty( "https.proxyPort", port );
   }

   private static String generateAuthorizationToken( String username, String password )
      throws UnsupportedEncodingException {
      return "Basic " +
         new String( Base64.getEncoder().encode( (username + ":" + password).getBytes() ), StandardCharsets.UTF_8 );
   }

}

Next, this code sends a CAN trace line against an already uploaded and available decoder.

package com.bosch.si.sfde;

import java.io.UnsupportedEncodingException;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

import javax.ws.rs.core.MediaType;

import com.sun.jersey.api.client.Client;
import com.sun.jersey.api.client.ClientResponse;
import com.sun.jersey.api.client.WebResource;

public class DecoderServiceDecodingExample {

   public static void main( String[] args ) throws Exception {
      // set proxy settings
      final String PROXY_HOST = "rb-proxy-de.bosch.com";
      final String PROXY_PORT = "8080";
      setProxySettings( PROXY_HOST, PROXY_PORT );

      // set user credentials
      String username = "your_username";
      String password = "your_password";
      String authorizationCredentials = generateAuthorizationToken( username, password );

      // prepare and send http post request
      String project = "demo";
      String type = "fibex";
      String decoderId = "your_decoder_ID";
      String hexInput = "your_HEX_input";

      String resourceUrl = "https://www.bosch-sfde.com/data-decoder-service/v1/"
         + project + "/decoders/" + decoderId + "/" + type + "/test";

      WebResource webResource = Client.create().resource( resourceUrl );
      ClientResponse clientResponse = webResource
         .accept( MediaType.WILDCARD_TYPE )
         .type( MediaType.APPLICATION_JSON_TYPE )
         .header( "Authorization", authorizationCredentials )
         .post( ClientResponse.class, "{\"testDataWithPdu\":[\"" + hexInput + "\"]}" );

      System.out.println( clientResponse.getStatus() );
      System.out.println( clientResponse.getEntity( String.class ) );
   }

   private static void setProxySettings( String host, String port ) {
      System.setProperty( "http.proxyHost", host );
      System.setProperty( "http.proxyPort", port );
      System.setProperty( "https.proxyHost", host );
      System.setProperty( "https.proxyPort", port );
   }

   private static String generateAuthorizationToken( String username, String password )
      throws UnsupportedEncodingException {
      return "Basic " +
         new String( Base64.getEncoder().encode( (username + ":" + password).getBytes() ), StandardCharsets.UTF_8 );
   }

}

6.3. C#

6.3.1. Sending Data

This code example shows you how to send data to the HTTP Data Recorder Service of the Bosch IoT Insights backend using C# code.

public bool uploadFileToSFDE(string ProjectName,
string Username, string Password, string FilePath, string ProxyUrl,
int ProxyPort, string ProxyUserName, string ProxyPassword)
{
         // Switch all following web requests to use TLS 1.2; other protocols (SSL3/TLS1.0/TLS1.1) are not secure anymore
         ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;

         // Create a web request to SFDE
         HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create("https://www.bosch-sfde.com/data-recorder-service/v2/" + ProjectName);
         webRequest.Timeout = 30000;
         webRequest.Method = "POST";

         // Define your content type e.g. xml, json, ...
         webRequest.ContentType = "application/xml";

         // Set the basic authentication header from the SFDE credentials
         webRequest.Headers.Add("Authorization", "Basic " + Convert.ToBase64String
         (System.Text.Encoding.Default.GetBytes(""+ Username + ":" + Password)));

         // If a proxy URL is set, use it
         if ( ProxyUrl != null )
         {
                 // set user name and password as credentials for the proxy
                 WebProxy proxyHTTP = new WebProxy(ProxyUrl, ProxyPort);
                 ICredentials proxyCredentials = new NetworkCredential(ProxyUserName, ProxyPassword);
                 proxyHTTP.Credentials = proxyCredentials;
                 webRequest.Proxy = proxyHTTP;
         }

         // Read the content of the given file and convert file data to byte array
         StringBuilder fileContent = new StringBuilder();
         StreamReader fileReader = new StreamReader(FilePath);
         String line = "";
         while ((line = fileReader.ReadLine()) != null)
         {
                fileContent.AppendLine(line);
         }
         fileReader.Close();
         byte[] fileContentBytes = Encoding.UTF8.GetBytes(fileContent.ToString());

         // Attach the content to the output stream
         webRequest.ContentLength = fileContentBytes.Length;
         webRequest.GetRequestStream().Write(fileContentBytes, 0, fileContentBytes.Length);
         webRequest.GetRequestStream().Flush();
         webRequest.GetRequestStream().Close();

         // finally transmit this Web Request to SFDE
         HttpWebResponse webResponse = (HttpWebResponse)webRequest.GetResponse();

         // A successful answer returns status OK from the service
         Console.WriteLine( "WebResponse Contains " + webResponse.StatusDescription );
         if (webResponse.StatusCode == System.Net.HttpStatusCode.OK)
         {
                return true;
         }
         else
         {
                return false;
         }
}

6.4. C++

6.4.1. Sending Data

This code example shows you how to send data to the HTTP Data Recorder Service of the Bosch IoT Insights backend using C++ and libcurl as third party library.

Libcurl is a multiprotocol file transfer library written in C. More information about libcurl can be found here.

//Include Libcurl as C third party library https://curl.haxx.se/libcurl/
#include "stdafx.h"
#include <curl\curl.h>
#include <iostream>
#include <stdio.h>
#include <string>

using namespace std;
CURL *curl;
CURLcode res;
long responsecode;

string url = "https://www.bosch-sfde.com/data-recorder-service/v2/";
string project = "demo";
const char *headerAccept = "Accept: application/json";
const char *headerContentType = "Content-Type: application/json";
//Encode username:password in Base64 and insert the resulting string after "Basic"
const char *headerAuthorization = "Authorization: Basic Zm9vOmJhcg==";

//Define a proxy connection if required. Otherwise leave the option NULL.
const char *proxyUrl;
//set your proxy authentication in the following syntax: username:password
const char *proxyAuth;
//set postFields to send a POST request, otherwise libcurl sends a GET request
string postFields="";

void addHttpHeaders(CURL *curl) {
        struct curl_slist *headerlist = NULL;
        headerlist = curl_slist_append(headerlist, headerAccept);
        headerlist = curl_slist_append(headerlist, headerAuthorization);
        headerlist = curl_slist_append(headerlist, headerContentType);
        curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headerlist);
}

void proxySettings(CURL *curl) {

        curl_easy_setopt(curl, CURLOPT_PROXY, proxyUrl);
        curl_easy_setopt(curl, CURLOPT_PROXYUSERPWD, proxyAuth);
}

CURLcode request(CURL *curl, string postFields) {

        string concatUrl = url + project;
        const char *completeUrl = concatUrl.c_str();
        curl_easy_setopt(curl, CURLOPT_URL, completeUrl);
        //This option disables the certificate check. It isn't recommended to use this option!
        //curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, 0L);

        //set postFields for the POST method, otherwise libcurl uses HTTP GET by default
        if (!postFields.empty()) {
                curl_easy_setopt(curl, CURLOPT_POSTFIELDS, postFields.c_str());
        }
        return curl_easy_perform(curl);
}

bool checkProjectState() {
        curl = curl_easy_init();
        if (curl) {
                addHttpHeaders(curl);

                if ((proxyUrl != NULL) && (proxyUrl[0] != '\0')) {
                        proxySettings(curl);
                }
                res = request(curl, postFields);

                /* Check for errors */
                if (res != CURLE_OK) {
                        fprintf(stderr, "curl_easy_perform() failed: %s\n",
                                curl_easy_strerror(res));
                        curl_easy_cleanup(curl);
                        return false;
                }
                else {
                        curl_easy_cleanup(curl);
                        return true;
                }
        }
        return false;
}
void sendData() {

        curl = curl_easy_init();

        if (curl) {

                addHttpHeaders(curl);
                if ((proxyUrl != NULL) && (proxyUrl[0] != '\0')) {
                        proxySettings(curl);
                }
                res = request(curl, postFields);

                /* Check for errors */
                if (res != CURLE_OK) {
                        fprintf(stderr, "curl_easy_perform() failed: %s\n",
                                curl_easy_strerror(res));
                }
                else {
                        curl_easy_getinfo(curl, CURLINFO_RESPONSE_CODE, &responsecode);
                        if (responsecode == 201) {
                                cout << "successful upload" << endl;
                        }
                        else if (responsecode == 304) {
                                cout << "Uploaded data is marked as duplicate and wasn't processed" << endl;
                        }
                        else {
                                cout << "Something went wrong. Server answered with status code: " << responsecode << endl;
                        }
                }

                curl_easy_cleanup(curl);
        }
}
int main() {
        curl_global_init(CURL_GLOBAL_ALL);
        if (checkProjectState()) {
                postFields = "{\"Hello\": \"World\"}";
                sendData();

                cout << "press any key to exit" << endl;
                string wait;
                getline(cin, wait);
                curl_global_cleanup();
                return 0;
        }
        else {
                cout << "press any key to exit" << endl;
                string wait;
                getline(cin, wait);
                curl_global_cleanup();
                return 1;
        }
}

6.5. Python

You can download the following code examples as a ZIP file and import them into your development IDE. Make sure to set your username and password as well as the appropriate proxy and the desired Bosch IoT Insights project before running the applications.

6.5.1. Sending Data

The following code example shows you how to send data to the HTTP Data Recorder Service of the Bosch IoT Insights backend using Python code.

import base64
from urllib.request import Request, ProxyHandler, build_opener

SFDE_USERNAME = '<username>'
SFDE_PASSWORD = '<password>'
PROXIES = {'http': 'http://<username>:<password>@rb-proxy-de.bosch.com:8080',
           'https': 'https://<username>:<password>@rb-proxy-de.bosch.com:8080'}
sfdeProject = 'demo'

def basics_auth(username, password):
    """ Method which return a base 64 basic authentication
    :param username: your SFDE username
    :param password: your password username
    :return: returns string
    """
    credential = username + ':' + password
    encoded_credential = credential.encode('ascii')
    b_encoded_credential = base64.b64encode(encoded_credential)
    b_encoded_credential = b_encoded_credential.decode('ascii')
    b_auth = b_encoded_credential
    return 'Basic %s' % b_auth


def make_request(server_url, post_data=None):
    """ Method which handel a request and return response as json
    :param server_url:  the url for the request
    :param post_data: post data for Post method
    :return: content
    """
    proxy = ProxyHandler(PROXIES)
    opener = build_opener(proxy)
    r = Request(server_url + sfdeProject)

    sfde_b_auth = basics_auth(SFDE_USERNAME, SFDE_PASSWORD)
    r.add_header('Authorization', sfde_b_auth)
    r.add_header('Content-Type', 'application/json')
    r.add_header('Accept', 'application/json')

    r_data = post_data
    r.data = r_data

    handle = opener.open(r)
    content = handle.read().decode('utf8')
    return content


def main():
    try:
        # Server and the api service details
        server = 'https://www.bosch-sfde.com'
        serviceBaseUrl = '/data-recorder-service/'
        service = 'v1/'

        payload = '{"Python": "Hello World"}'
        url = server + serviceBaseUrl + service  # construction of the final url for the first POST request
        data = payload.encode("ascii")  # encoding format is ascii for the initial POST request
        res = make_request(url, data)

        print(res)
        print('Data was sent to project: ' + sfdeProject)

    except IOError as e:
        print("Problem with the request..")
        print(e)
        print(e.read())

if __name__ == '__main__': main()

6.5.2. Synchronous query execution

This code example shows you how to execute a MongoDB aggregation query against a demo collection.

import base64
import json
from urllib.request import Request, ProxyHandler, build_opener

SFDE_USERNAME = '<username>'
SFDE_PASSWORD = '<password>'
PROXIES = {'http': 'http://<username>:<password>@rb-proxy-de.bosch.com:8080',
           'https': 'https://<username>:<password>@rb-proxy-de.bosch.com:8080'}


def basics_auth(username, password):
    """ Method which return a base 64 basic authentication
    :param username: your SFDE username
    :param password: your password username
    :return: returns string
    """
    credential = username + ':' + password
    encoded_credential = credential.encode('ascii')
    b_encoded_credential = base64.b64encode(encoded_credential)
    b_encoded_credential = b_encoded_credential.decode('ascii')
    b_auth = b_encoded_credential
    return 'Basic %s' % b_auth


def make_request(server_url, method, post_data=None):
    """ Method which handel a request and return response as json
    :param server_url:  the url for the request
    :param method: GET or POST method
    :param post_data: post data for Post method
    :return: returns json
    """
    proxy = ProxyHandler(PROXIES)
    opener = build_opener(proxy)
    r = Request(server_url)

    sfde_b_auth = basics_auth(SFDE_USERNAME, SFDE_PASSWORD)
    r.add_header('Authorization', sfde_b_auth)
    r.add_header('Content-Type', 'application/json')
    r.add_header('Accept', 'application/json')

    r_data = post_data
    r.data = r_data

    handle = opener.open(r)
    content = handle.read().decode('utf8')
    response = json.loads(content)
    return response


def main():
    try:
        # Server and the api service details
        server = 'https://www.bosch-sfde.com:443'
        serviceBaseUrl = '/mongodb-query-service/'
        service = 'v1/queries'
        query = '''
            {
              "aggregate": [
                {"$limit":1}
              ],
              "async": false,
              "collection": "demo_processed_data"
            }
        '''

        url = server + serviceBaseUrl + service  # construction of the final url for the first POST request
        data = query.encode("ascii")  # encoding format is ascii for the initial POST request
        res = make_request(url, 'POST', data)

        print(res)

    except IOError as e:
        print("Problem with the request..")
        print(e)


if __name__ == '__main__': main()

6.5.3. Asynchronous query execution

The following code example shows you how to send an asynchronous MongoDB aggregation query and save the result as a CSV file using Python. Make sure to set your authorization credentials and, if necessary, the appropriate proxy settings.

import base64
import csv
import json
import time
from urllib.request import Request, ProxyHandler, build_opener

SFDE_USERNAME = '<username>'
SFDE_PASSWORD = '<password>'
PROXIES = {'http': 'http://<username>:<password>@rb-proxy-de.bosch.com:8080',
           'https': 'https://<username>:<password>@rb-proxy-de.bosch.com:8080'}


def basics_auth(username, password):
    """ Method which return a base 64 basic authentication
    :param username: your SFDE username
    :param password: your password username
    :return: returns string
    """
    credential = username + ':' + password
    encoded_credential = credential.encode('ascii')
    b_encoded_credential = base64.b64encode(encoded_credential)
    b_encoded_credential = b_encoded_credential.decode('ascii')
    b_auth = b_encoded_credential
    return 'Basic %s' % b_auth


def make_request(server_url, method, post_data=None):
    """ Method which handel a request and return response as json
    :param server_url:  the url for the request
    :param method: GET or POST method
    :param post_data: post data for Post method
    :return: returns json
    """
    proxy = ProxyHandler(PROXIES)
    opener = build_opener(proxy)
    r = Request(server_url)

    sfde_b_auth = basics_auth(SFDE_USERNAME, SFDE_PASSWORD)
    r.add_header('Authorization', sfde_b_auth)
    r.add_header('Content-Type', 'application/json')
    r.add_header('Accept', 'application/json')
    if method == 'POST':
        r_data = post_data
        r.data = r_data
        r.get_method = lambda: 'POST'

        handle = opener.open(r)
        content = handle.read().decode('utf8')
        response = json.loads(content)
        return response

    elif method == 'GET':
        r.get_method = lambda: 'GET'
        handle = opener.open(r)
        content = handle.read().decode('utf8')
        response = json.loads(content)
        return response
    else:
        print('Method %s is not available' % method)


def main():
    try:
        # Server and the api service details
        server = 'https://www.bosch-sfde.com:443'
        serviceBaseUrl = '/mongodb-query-service/'
        service = 'v1/queries'
        query = '''
            {
              "aggregate": [
                {"$limit":10}
              ],
              "async": true,
              "collection": "demo_processed_data"
            }
        '''

        url = server + serviceBaseUrl + service  # construction of the final url for the first POST request
        data = query.encode("ascii")  # encoding format is ascii for the initial POST request

        res = make_request(url, 'POST', data)
        post_id = res['id']  # retrieve the ID of the POST request
        print('post id', post_id)

        # Construct the GET request for POLLING the server using the POST ID
        get_url = url + "/" + post_id
        print("get_url: ", get_url)
        res_temp = make_request(get_url, 'GET')
        status_temp = res_temp['status']
        print('current status', status_temp)

        # POLLING the server in this loop till the status of the GET request is SUCCESSFUL
        while True:
            print('Entered inside loop')
            res_temp = make_request(get_url, 'GET')
            status_temp = res_temp['status']
            if status_temp == 'SUCCESSFUL':
                break
            print('current status - INSIDE LOOP', status_temp)
            time.sleep(2)

        # If the status is SUCCESSFUL,  we retrieve the results using the POST ID in the json format

        if status_temp == 'SUCCESSFUL':
            get_url_result = url + "/" + post_id + "/result?format=application%2Fjson"
            print("get_url_result: ", get_url_result)
            res_final = make_request(get_url_result, 'GET')

            # writing the JSON response to CSV file
            with open("vin_ccu_brake_events.csv", "w", newline='') as file:
                output = csv.writer(file)
                output.writerow(res_final[0].keys())  # header row
                for row in res_final:
                    output.writerow(row.values())
            print("-----------------------------final response--------------------------------")
            print(res_final)
        else:
            print("Error in data processing on the server: ", status_temp)
    except IOError as e:
        print("Problem with the request..")
        print(e)
        print(e.read())


if __name__ == '__main__': main()

6.6. Visual Basic for Excel

6.6.1. Asynchronous query execution

The following code example also shows you how to send an asynchronous MongoDB aggregation query, but saves the result in an Excel sheet with Visual Basic. Make sure to set your authorization credentials and, if necessary, the appropriate proxy settings.

Sub ImportCSVFile(filepath As String)
    ''' <summary>
    ''' Read a CSV file into the Excel sheet.
    ''' </summary>
    ''' <value>The path of the CSVFile</value>

    Dim line As String
    Dim arrayOfElements
    Dim linenumber As Integer
    Dim elementNumber As Integer
    Dim element As Variant

    linenumber = 0
    elementNumber = 0

    Open filepath For Input As #1 ' Open file for input
        Do While Not EOF(1) ' Loop until end of file
            linenumber = linenumber + 1
            Line Input #1, line
            arrayOfElements = Split(line, ";")

            elementNumber = 0
            For Each element In arrayOfElements
                elementNumber = elementNumber + 1
                Cells(linenumber, elementNumber).Value = element
            Next
        Loop
    Close #1 ' Close file.
End Sub

Public WithEvents newButton As Windows.Forms.Button



Sub AsyncMongoRequest()
    ''' <summary>
    ''' Set header, basic authentication and proxy for the WinHttpRequest.
    ''' Save the HTTPRequest response as csvFile.
    ''' Read the csvFile in Excel-sheet.
    ''' Requirement:
    ''' JsonConverter See: (https://github.com/VBA-tools/VBA-JSON)
    ''' TODO handle catch and exception
    ''' </summary>

    Dim strResult As String
    Dim query As String
    Dim objHTTP As Object
    Dim url As String
    Dim Json  As Object
    Dim status As String
    Dim postId As String
    Dim proxyServer As String
    Dim basicAuth As String
    Dim ntUser As String
    Dim ntPassword As String
    Dim fso As Object
    Dim oFile As Object

    server = "https://www.bosch-sfde.com:443"
    serviceBaseUrl = "/mongodb-query-service/"
    service = "v1/queries"
    Set objHTTP = CreateObject("WinHttp.WinHttpRequest.5.1")
    query = "{""aggregate"": [{""$limit"":10}],""async"": true, ""collection"": ""demo_processed_data""}"
    proxyServer = "rb-proxy-de.bosch.com:8080"
    basicAuth = "Basic " & "<your basic authentication string>"
    ntUser = "<username>"
    ntPassword = "<password>"
    url = server & serviceBaseUrl & service

    objHTTP.Open "POST", url, False
    objHTTP.setRequestHeader "User-Agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)"
    objHTTP.setRequestHeader "Content-type", "application/json"
    objHTTP.setRequestHeader "Authorization", basicAuth
    objHTTP.setRequestHeader "Connection", "Keep-Alive"
    objHTTP.setProxy 2, proxyServer, ""
    objHTTP.SetCredentials ntUser, ntPassword, 1
    objHTTP.send query

    strResult = objHTTP.responseText
    Set Json = JsonConverter.ParseJson(strResult)
    postId = Json("id")

    url = url & "/" & postId
    objHTTP.Open "GET", url, False
    objHTTP.send
    strResult = objHTTP.responseText
    status = objHTTP.status
    Set Json = JsonConverter.ParseJson(strResult)
    status = Json("status")

    If status = "SUCCESSFUL" Then
        url = url & "/result?format=text%2Fvnd.sfde.excel.de%2Bcsv"
        objHTTP.Open "GET", url, False
        objHTTP.send

        strResult = objHTTP.responseText

        Set fso = CreateObject("Scripting.FileSystemObject")

        Set oFile = fso.CreateTextFile("filepath://where you want to store the response")

        oFile.WriteLine strResult
        oFile.Close
        Close

        ImportCSVFile "filepath://where your response is stored"
    Else
        MsgBox "Error in data processing on the server: " & strResult & status
    End If

End Sub

Private Sub Form1_Load() Handles Me.Load
    ''' <summary>
    ''' Create sfdeDownloadButton and
    ''' append AsyncMongoRequest sub function to the button
    ''' </summary>

    newButton = New Windows.Forms.Button
    newButton.Name = "sfdeDownloadButton"
    newButton.Top = 20 * 30
    newButton.Left = 40

    AddHandler newButton.Click, AddressOf AsyncMongoRequest
    Me.Controls.Add(newButton)
End Sub

6.7. Scala

You can download the following code examples as a ZIP file and import them into your development IDE. Make sure to set your username and password as well as the appropriate proxy and the desired Bosch IoT Insights project before running the applications.

6.7.1. Sending Data

The following code example shows you how to send data to the HTTP Data Recorder Service of the Insights backend using Scala code.

import java.nio.charset.StandardCharsets
import com.sun.jersey.api.client.Client
import com.sun.jersey.api.client.ClientResponse
import com.sun.jersey.api.client.WebResource
import com.sun.jersey.core.util.Base64

object DataRecorderServiceExample {

  def main( args: Array[String] ): Unit = {

    val proxyHost: String = "rb-proxy-de.bosch.com"
    val proxyPort: String = "8080"

    val resourceUrl: String = "https://www.bosch-sfde.com/data-recorder-service/v1/"
    val sfdeProject: String = "demo"

    val username: String = "your_username"
    val password: String = "your_password"
    val authorizationCredentials: String = generateAuthorizationToken( username, password )

    val contentType: String = "text/plain"
    val payload: String = "Hello World"

    System.setProperty( "https.proxyHost", proxyHost )
    System.setProperty( "https.proxyPort", proxyPort )

    val service: WebResource = Client.create().resource( resourceUrl + sfdeProject )
    val response: ClientResponse = service.header( "Authorization", authorizationCredentials )
      .header( "Content-Type", contentType )
      .post( classOf[ClientResponse], payload )

    println( response )
  }

  private def generateAuthorizationToken( username: String, password: String ): String =
    "Basic " + new String( Base64.encode(username + ":" + password), StandardCharsets.UTF_8 )

}

6.7.2. Synchronous query execution

This code example shows you how to execute a MongoDB aggregation query against a demo collection.

In this example, the aggregation query is stored in the separate file queryParametersSync.json, which is read by the main program in MongoDBQueryServiceSyncExample.scala.

Please note that a synchronous call blocks until the query result is returned, which might lead to HTTP timeouts if the query takes too long.

MongoDBQueryServiceSyncExample.scala:
import java.nio.charset.StandardCharsets
import java.nio.file.Files
import java.nio.file.Paths
import com.google.gson.Gson
import com.google.gson.GsonBuilder
import com.google.gson.JsonElement
import com.google.gson.JsonParser
import com.sun.jersey.api.client.Client
import com.sun.jersey.api.client.ClientResponse
import com.sun.jersey.api.client.WebResource
import com.sun.jersey.core.util.Base64

object MongoDBQueryServiceSyncExample {

  def main( args: Array[String] ): Unit = {

    val proxyHost: String = "rb-proxy-de.bosch.com"
    val proxyPort: String = "8080"

    val resourceUrl: String = "https://www.bosch-sfde.com:443/mongodb-query-service/v1/queries"

    val username: String = "your_username"
    val password: String = "your_password"
    val authorizationCredentials: String = generateAuthorizationToken( username, password )

    val contentType: String = "application/json"
    val payload: String = new String( Files.readAllBytes( Paths.get( "src/resources/queryParametersSync.json" )))

    System.setProperty( "https.proxyHost", proxyHost )
    System.setProperty( "https.proxyPort", proxyPort )

    val service: WebResource = Client.create().resource( resourceUrl )
    val response: ClientResponse = service.header( "Authorization", authorizationCredentials )
      .header( "Content-Type", contentType )
      .post( classOf[ClientResponse ], payload)

    println( response )

    if ( response.getStatus == 200 ) {
      println( parseJson( response.getEntity( classOf[String] ) ) )
    }
  }

  private def generateAuthorizationToken( username: String, password: String ): String =
    "Basic " + new String( Base64.encode( username + ":" + password ), StandardCharsets.UTF_8 )

  private def parseJson(plainString: String): String = {
    val gson: Gson = new GsonBuilder().setPrettyPrinting().create()
    val json: JsonElement = new JsonParser().parse( plainString )
    gson.toJson( json )
  }

}

6.7.3. Asynchronous query execution

The following example shows the entire sequence of an asynchronous MongoDB aggregation query execution. Please note that an asynchronous call does not return the query result immediately. You have to poll the status of the query until it changes to SUCCESSFUL. Then you can fetch the results.

In this example, the aggregation query is stored in the separate file queryParametersAsync.json, which is read by the main program in MongoDBQueryServiceAsyncExample.scala.

MongoDBQueryServiceAsyncExample.scala:
import java.io.InputStream
import java.io.InputStreamReader
import java.io.Reader
import java.nio.charset.StandardCharsets
import java.nio.file.Files
import java.nio.file.Path
import java.nio.file.Paths
import java.nio.file.StandardCopyOption
import com.google.gson.Gson
import com.google.gson.JsonObject
import com.sun.jersey.api.client.Client
import com.sun.jersey.api.client.ClientResponse
import com.sun.jersey.api.client.WebResource
import com.sun.jersey.core.util.Base64

object MongoDBQueryServiceAsyncExample {

  def main( args: Array[String] ): Unit = {

    val proxyHost: String = "rb-proxy-de.bosch.com"
    val proxyPort: String = "8080"

    val resourceUrl: String = "https://www.bosch-sfde.com:443/mongodb-query-service/v1/queries"

    val username: String = "your_username"
    val password: String = "your_password"
    val authorizationCredentials: String = generateAuthorizationToken(username, password)

    val contentType: String = "application/json"
    val payload: String = new String( Files.readAllBytes( Paths.get( "src/resources/queryParametersAsync.json" ) ) )

    System.setProperty( "https.proxyHost", proxyHost )
    System.setProperty( "https.proxyPort", proxyPort )

    val response: ClientResponse = httpRequestPost( resourceUrl, authorizationCredentials, contentType, payload )

    println( response )

    if ( response.getStatus == 200 ) {
      var queryError: Boolean = false
      val responseContent: String = response.getEntity( classOf[String] )

      val json: JsonObject = new Gson().fromJson( responseContent, classOf[JsonObject] )
      val requestId: String = json.get( "id" ).getAsString
      var queryStatus: String = json.get( "status" ).getAsString

      println( "Status: " + queryStatus )

      while ( queryStatus != "SUCCESSFUL" && !queryError ) {
        val responseStream: InputStream = httpRequestGet( resourceUrl + "/" + requestId, authorizationCredentials,
          contentType )
        val reader: Reader = new InputStreamReader( responseStream )
        val jsonRes2: JsonObject = new Gson().fromJson( reader, classOf[JsonObject] )
        queryStatus = jsonRes2.get( "status" ).getAsString
        println( "Status: " + queryStatus )

        if ( queryStatus == "FAILED" || queryStatus == "INCORRECT" ) {
          queryError = true
        } else {
          // wait a second before polling the status again
          Thread.sleep( 1000 )
        }
      }

      if ( !queryError ) {
        val responseStream: InputStream = httpRequestGet( resourceUrl + "/" + requestId + "/result", authorizationCredentials,
          contentType )
        println( "Status: LOADING DATA..." )
        val path: Path = Paths.get(System.getProperty( "user.home" ) + "/queryResponse.txt" )
        Files.copy( responseStream, path, StandardCopyOption.REPLACE_EXISTING )
        println( "Status: DONE" )
        println( "Response data was saved at " + path )
      }
    }
  }

  private def httpRequestPost( url: String, auth: String, cType: String, payload: String ): ClientResponse = {
    val service: WebResource = Client.create().resource( url )
    val response: ClientResponse =
      service.header( "Authorization", auth).header( "Content-Type", cType ).post( classOf[ClientResponse], payload )
    response
  }

  private def httpRequestGet( url: String, auth: String, cType: String ): InputStream = {
    val service: WebResource = Client.create().resource( url )
    val response: InputStream =
      service.header( "Authorization", auth ).header( "Content-Type", cType).get( classOf[InputStream] )
    response
  }

  private def generateAuthorizationToken( username: String, password: String ): String =
    "Basic " + new String(Base64.encode(username + ":" + password), StandardCharsets.UTF_8 )

}

6.7.4. Data Decoder Service

The following code shows how to upload a new decoder file for making it available within the Data Decoder Service.

import java.io.File
import java.nio.charset.StandardCharsets
import java.util.Base64
import javax.ws.rs.core.MediaType
import org.glassfish.jersey.media.multipart.FormDataMultiPart
import org.glassfish.jersey.media.multipart.file.FileDataBodyPart
import org.glassfish.jersey.media.multipart.internal.MultiPartWriter
import com.sun.jersey.api.client.Client
import com.sun.jersey.api.client.ClientResponse
import com.sun.jersey.api.client.WebResource
import com.sun.jersey.api.client.config.DefaultClientConfig

object DecoderServiceUploadExample {

  def main( args: Array[String] ): Unit = {
    // set proxy settings
    val PROXY_HOST: String = "rb-proxy-de.bosch.com"
    val PROXY_PORT: String = "8080"
    setProxySettings(PROXY_HOST, PROXY_PORT)

    // set user credentials
    val username: String = "your_username"
    val password: String = "your_password"
    val authorizationCredentials: String = generateAuthorizationToken( username, password )

    // prepare form data upload
    val project: String = "demo"
    val `type`: String = "FIBEX"
    val name: String = "my-new-decoder-file"
    val comment: String = "new new decoder file"
    val file: File = new File( "path_to_your_decoder_spec_file" )

    val resourceUrl: String = "https://www.bosch-sfde.com/data-decoder-service/v1/" + project + "/decoders"

    val defaultClientConfig: DefaultClientConfig = new DefaultClientConfig()
    defaultClientConfig.getClasses.add( classOf[MultiPartWriter] )
    val webResource: WebResource = Client.create( defaultClientConfig ).resource( resourceUrl )

    val formDataMultiPart: FormDataMultiPart = new FormDataMultiPart()
      .field( "type", `type` )
      .field( "project", project )
      .field( "name", name )
      .field( "comment", comment )
      .bodyPart( new FileDataBodyPart( "file", file ) )
      .asInstanceOf[FormDataMultiPart]

    // send http post request
    val clientResponse: ClientResponse = webResource
      .accept( MediaType.WILDCARD_TYPE )
      .`type`( MediaType.MULTIPART_FORM_DATA_TYPE )
      .header("Authorization", authorizationCredentials )
      .post( classOf[ClientResponse], formDataMultiPart )
    formDataMultiPart.close()

    println( clientResponse.getStatus )
    println( clientResponse.getEntity( classOf[String] ) )
  }

  private def setProxySettings( host: String, port: String ): Unit = {
    System.setProperty( "http.proxyHost", host )
    System.setProperty( "http.proxyPort", port )
    System.setProperty( "https.proxyHost", host )
    System.setProperty( "https.proxyPort", port )
  }

  private def generateAuthorizationToken( username: String, password: String ): String =
    "Basic " + new String( Base64.getEncoder.encode( (username + ":" + password).getBytes ), StandardCharsets.UTF_8 )

}

Next, the following code sends a CAN trace line to an already uploaded and available decoder for test decoding.

import java.nio.charset.StandardCharsets
import java.util.Base64
import javax.ws.rs.core.MediaType
import com.sun.jersey.api.client.Client
import com.sun.jersey.api.client.ClientResponse
import com.sun.jersey.api.client.WebResource

object DecoderServiceDecodingExample {

  def main( args: Array[String] ): Unit = {

    // set proxy setting
    val PROXY_HOST: String = "rb-proxy-de.bosch.com"
    val PROXY_PORT: String = "8080"
    setProxySettings( PROXY_HOST, PROXY_PORT )

    // set user credentials
    val username: String = "your_username"
    val password: String = "your_password"
    val authorizationCredentials: String = generateAuthorizationToken(username, password)

    // prepare and send http post request
    val project: String = "demo"
    val `type`: String = "FIBEX"
    val decoderId: String = "your_decoder_ID"
    val hexInput: String = "your_HEX_input"

    val resourceUrl: String = "https://www.bosch-sfde.com/data-decoder-service/v1/" +
      project + "/decoders/" + decoderId + "/" + `type` + "/test"

    val webResource: WebResource = Client.create().resource( resourceUrl )
    val clientResponse: ClientResponse = webResource.accept( MediaType.WILDCARD_TYPE).`type`(MediaType.APPLICATION_JSON_TYPE )
      .header("Authorization", authorizationCredentials )
      .post( classOf[ClientResponse], "{\"testDataWithPdu\":[\"" + hexInput + "\"]}" )

    println( clientResponse.getStatus )
    println( clientResponse.getEntity( classOf[String] ) )

  }

  private def setProxySettings( host: String, port: String ): Unit = {
    System.setProperty( "http.proxyHost", host )
    System.setProperty( "http.proxyPort", port )
    System.setProperty( "https.proxyHost", host )
    System.setProperty( "https.proxyPort", port )
  }

  private def generateAuthorizationToken( username: String, password: String ): String =
    "Basic " + new String( Base64.getEncoder.encode( (username + ":" + password).getBytes ), StandardCharsets.UTF_8 )

}

7. Tools

7.1. Json2Csv Converter

The Json2Csv Converter allows a user to convert JSON data to CSV data offline on a local computer, for example after JSON data has been downloaded from the Data Analyzer. The user can use either the Graphical User Interface or the Command Line Tool for the conversion. Before describing both tools, this section first gives a brief introduction to the challenge of converting NoSQL data (e.g. JSON) to structured data (e.g. CSV).

Converting becomes difficult when datasets and their attributes differ or when a field contains more than one value entry (streaming values). When datasets and their attributes differ, either a model dataset can be chosen or the data has to be evaluated twice (a first run for finding all attributes and a second run for handling the actual data). Streaming values within a JSON value field cannot be transferred properly to a regular CSV structure. At this point, the user is offered three different options to solve the issue. The following solutions are illustrated with this sample dataset:

[
    {
        "_id": "111",
        "P1": [5, 2, 2, 3, 2],
        "P2": [0.76, 0.67, 0.77, 0.66, 0.65],
        "P3": [3.65, 3.77, 3.78, 3.89, 3.89]
    }, {
        "_id": "222",
        "P1": [4, 4, 2, 2, 5],
        "P2": [0.55, 0.57, 0.75, 0.56, 0.76],
        "P3": [4.55, 3.99, 4.11, 4.12, 4.24]

    }, {
        "_id": "333",
        "P1": [2, 3, 4, 3, 3],
        "P2": [0.45, 0.45, 0.65, 0.45, 0.55],
        "P3": [3.01, 2.89, 2.87, 2.99, 3.21]
    }
]

Path Creation (Default):
Attributes of streaming values are unrolled so that every attribute gets its own full path (with a counter added to its name). Rows therefore still represent single datasets, whereas columns are unwound uniquely (according to the counter) and contain only atomic entries. Each streaming value therefore stays within a single row, spread across several counter-suffixed columns. This converting option is the default method since it generates the only correct output for a regular and valid CSV file.

j2c path creation
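For the first dataset of the sample above, the output of this method might look as follows (a sketch, assuming ; as column separator and an underscore counter suffix; the actual naming scheme may differ):

_id;P1_0;P1_1;P1_2;P1_3;P1_4;P2_0;P2_1;P2_2;P2_3;P2_4;P3_0;P3_1;P3_2;P3_3;P3_4
111;5;2;2;3;2;0.76;0.67;0.77;0.66;0.65;3.65;3.77;3.78;3.89;3.89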

Pivoting:
Instead of adding a counter to the path of the attribute, streaming values are pivoted. Columns still contain only atomic entries, but one row no longer represents one dataset. Columns remain unique, but several rows have to be merged to get one single data record. Each streaming value therefore stays within a single column, spread across several rows.

j2c pivoting
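Pivoted, the first dataset might be exported like this (a sketch, assuming ; as column separator; the five rows together form one data record):

_id;P1;P2;P3
111;5;0.76;3.65
111;2;0.67;3.77
111;2;0.77;3.78
111;3;0.66;3.89
111;2;0.65;3.89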

Array Compression:
When streaming values (arrays) are compressed, columns stay unique and rows still represent single data records. However, data fields no longer contain atomic values but several values (separated by a given symbol). Streaming values are therefore exported as "arrays" within one field.

j2c array compression
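With | as array separator, the first dataset might be compressed like this (a sketch, assuming ; as column separator):

_id;P1;P2;P3
111;5|2|2|3|2;0.76|0.67|0.77|0.66|0.65;3.65|3.77|3.78|3.89|3.89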

7.1.1. Graphical User Interface

The user can choose a JSON input file by clicking the respective button or by simply dropping a file onto the application (the same applies to configuration files). Next to the File Settings, the Custom Fields section is presented, where attributes of the input file can be searched for, selected, removed, reordered, renamed, and tagged as IsoDate or UTC timestamp fields. Furthermore, additional units can be added (as a second header row of the output CSV file). The Output Settings area offers the user the possibility to choose from predefined characters to define the CSV output file structure (column and decimal separators, line breaks, and the option to mask values within quotation marks). The user can furthermore choose between the three methods of converting streaming values (as shown above). To choose the default Path Creation, the user has to disable Pivoting and Compress Arrays (Pivoting and Compress Arrays are mutually exclusive). After having selected these options, the user can export the CSV file or save a preview file with the first 100 data records. The configuration can also be saved (and used within the Command Line Tool as well). When Path Creation, Pivoting, or Compress Arrays is selected, the output CSV file can furthermore be split into multiple CSV files (one CSV file for every JSON dataset).

7.1.2. Command Line Tool

As already stated above, all options of the Graphical User Interface are also available within the command line interface. For working with the command line tool, it is best practice to copy the json2csv-cli-start.cmd file and rename it (e.g. myJson2Csv.cmd). The newly created myJson2Csv.cmd file can now be edited (right-click and edit) according to the user's needs (passing arguments or property files, for example). A shortcut of the myJson2Csv.cmd file can then be created (right-click and create a shortcut) and copied to another directory (to the desktop, for example). This shortcut can now be used in the following ways by dropping a JSON input file onto it:

Using the parameters defined within the customized myJson2Csv.cmd file:
Simply drag a JSON input file onto the myJson2Csv.cmd file. This starts the application using the parameters of the customized cmd file. The following parameters can be passed to the application:

-of <arg>      original filename, path and extension of the
               SOURCE input .json file (e.g. "\c\...\input.json")
-nf <arg>      new filename and extension of the DEST output
               .csv file (e.g. "\c\...\output.csv")
-conf <arg>    filename, path and extension of the configuration
               file (e.g. "\c\...\myConfig.properties")
-lb <arg>      line break style ("\n", "\r" or "\n\r")
-cs <arg>      column separator (e.g. ";")
-as <arg>      array compression into strings using the given
               separator (e.g. "|")
-os <arg>      old separator for decimals (e.g. ".")
-ns <arg>      new separator for decimals (e.g. ",")
-tm            text masking, if set, text is masked with ["],
               otherwise no masking is applied
-tk            "Timsche Konvertierung", if set, streaming
               values are specially exported as columns
-pv            pivoting, if set, streaming values are exported as columns
-e <arg>       encoding ("UTF-8", "UTF-16" or "ANSI")
-ei <arg>      input file encoding ("UTF-8", "UTF-16" or "ANSI")
-s             split output documents (one single csv document
               for every json dataset)
-h             help information
-v             version information

Using a properties file for executing (more complex) own settings:
To do so, a properties file must be set up and passed via the -conf parameter within the myJson2Csv.cmd file. This properties file simply consists of key-value pairs separated by an equals sign (=). Next to the structural output settings (line breaks, column separators, etc.), user-chosen attributes, aliases, additional units, and IsoDate or UTC timestamp conversions can be passed (separated by a comma). When command line arguments and a properties file are passed together, the properties file overwrites the settings of the command line parameters. The following example shows the list of possible configuration settings within a sample properties file. It furthermore gives an impression of the syntax of the user-chosen attributes:

textMasking=true
newDecimalSeparator=,
oldDecimalSeparator=.
pivoting=false
timscheKonvertierung=false
compressArrays=true
encoding=UTF-8
encodingInput=UTF-8
arraySeparator=|
columnSeparator=;
lineBreak=\n
fileType=.csv
splitting=false
customField0=_id,,false,false,
customField1=payload.Motor.value,Motor,false,false,unit
customField2=metaData.eventStartTs,Start,true,false,

As shown above, the user-chosen attributes (full path, separated by a dot), their alias names, additional units, and the IsoDate or UTC timestamp conversions are separated by a comma. The key of a user-chosen attribute has to be called customField plus an enumeration (e.g. customField0). The enumeration determines the order of the attributes in the output CSV file. The IsoDate and UTC timestamp conversions have to be set either true or false, whereas the alias and the unit can be left empty:

EITHER:    customField0=path.of.the.attribute.and.name,alias-name,true,false,unit
    OR:    customField1=path.of.the.attribute.and.name,,false,true,

It is best practice to use the Graphical User Interface to generate a customized properties file in advance. This properties file can then be used by a personalized myJson2Csv.cmd file.

The following examples show the intention of a user and the corresponding customization of three sample myJson2Csv.cmd files:

Example 1:
Convert C:\data\input.json to C:\data\input.csv, separated by ;, converting the decimal point . to a comma ,:

java -jar lib\json2csv-cli-<version-number>.jar -of "C:\data\input.json" -cs ";" -os "." -ns ","
pause

Example 2:
Convert C:\data\input.json to C:\data\output.csv, separated by ;, with array compression using |, decimal point . converted to comma ,, text masked with ", encoded as UTF-8:

java -jar lib\json2csv-cli-<version-number>.jar -of "C:\data\input.json" -nf "output.csv" -cs ";" -as "|" -os "." -ns "," -tm -e "utf-8"
pause

Example 3:
Convert C:\data\input.json to C:\data\input.csv using the settings of the myConfig.properties file:

java -jar lib\json2csv-cli-<version-number>.jar -of "C:\data\input.json" -conf "C:\data\myConfig.properties"
pause

7.2. Cloud Http Client

The Cloud Http Client provides a simple way to upload data into an Insights project. The tool expects a defined source and destination directory. When started, it checks every file in the source directory (and, by default, all subdirectories) and starts an upload process for each file. Every successfully uploaded file is moved to the destination directory. In addition, the tool supports the following functions:

  • Using a proxy connection if required.

  • Supporting all Data Recorder Service functions described in the Swagger UI.

  • Adding additional meta information to the data.

The Cloud Http Client can be used as a command line tool or as an application with a graphical user interface.

7.2.1. Configure the Cloud Http Client

The Cloud Http Client expects a config.properties and a metaData.properties file in the project directory. The config.properties file contains all configuration options; the metaData.properties file can be used to set your own additional meta information. The Graphical User Interface supports configuration within the tool directly, but if you want to use the Command Line Version, the config files must be adapted manually. The config.properties file contains the following configuration fields:

Required fields

-sourceDir= Must contain an absolute or relative path to the source directory.
 By default, the file folder in the tool directory is selected.

-destinationDir= Must contain an absolute or relative path to the destination directory.
 By default, the file_after_upload folder in the tool directory is selected.

-considerSubdirectories= Defines whether only files in the source directory should be uploaded
 or also all files in subdirectories of the source directory.
 By default true. (Accepted values: true/false)

-projectUrl= Must contain the path to the Insights Data Recorder Service.
 By default, the path is set to https://www.bosch-sfde.com/data-recorder-service/v2/ and does not need to be changed.

-projectId= Must contain the ID of the target Insights project.

-projectUser= Must contain the user name of your Insights account.

-contentType= Must contain the content type of the files. The default is application/json.
Optional fields: These options enable additional functionality of the Cloud Http Client. They are not required and can be disabled with a # before the option name.

 -proxyUrl= If you need a proxy connection, define the proxy URL here. By default disabled.

 -proxyPort= If you need a proxy connection, define the proxy port here. By default disabled.

 -proxyUser= If you need a proxy connection with authentication, define the proxy user here.
 By default disabled.

 -X-Bulk-Mode= Can be used to enable the X-Bulk-Mode option in the sFDE Data Recorder Service.
 By default disabled. (Accepted values: true/false)

The Cloud Http Client supports specific functions for developers. It is not recommended for customers to enable these functions!

 -AllowUnsignedCertificates= By default disabled. This option is intended for developers only.
 Customers should keep this option disabled because it turns off some security features.
 (Accepted values: true/false)

The tool allows storing passwords as plaintext directly in the config file, but this is not recommended. If required, the tool will ask you for a password without persisting any passwords in plaintext.

 -proxyPassword= Can contain your proxy password. By default disabled.
 -projectPassword= Can contain your project password. By default disabled.
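Putting it all together, a minimal config.properties might look like the following sketch (demo, the proxy values, and your_username are placeholders; options with a leading # remain disabled):

sourceDir=file
destinationDir=file_after_upload
considerSubdirectories=true
projectUrl=https://www.bosch-sfde.com/data-recorder-service/v2/
projectId=demo
projectUser=your_username
contentType=application/json
#proxyUrl=your_proxy_host
#proxyPort=8080
#proxyUser=your_proxy_user
#X-Bulk-Mode=true
#AllowUnsignedCertificates=false
#proxyPassword=
#projectPassword=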

For each file, the Cloud Http Client sends the file date and the file name in the meta data. The client also allows additional self-defined meta data information to be added, which is sent with each file. All additional meta data information is defined in the metaData.properties file as key/value pairs. An example metaData.properties can look like this:

Example 1:

Group=smallCars
ECUVersion=1.2.0

Please note that the client also reads and stores incorrect pairs like "=value" or "name=". However, the client sends only correct pairs; all incorrect pairs are ignored during the sending process. Currently, the client sends all additional metaData information as a String datatype.

7.2.2. Graphical User Interface

This version of the Cloud Http Client supports the user with the tool configuration through a graphical user interface. Manual adaptation of the config files is not required; the functionality is the same as in the command line version. Start the tool by executing the uploadtool-gui-start.* file.

In the Graphical User Interface, the adding of additional meta data information is limited to 10 entries. If you want to add more key/value pairs, open the metaData.properties file, add all key/value pairs manually, save the file, and open the Graphical User Interface again.

7.2.3. Command Line Interface

Before you can use the Bosch IoT Insights command line tool, the config.properties and metaData.properties files must be adapted manually. The command line version can then be started by executing the uploadtool-cmd-start.* file. The tool will guide you through the upload process.


8. Terminology

8.1. Data Vocabulary

Data
  • lowest level of abstraction from which information and then knowledge is derived

  • sequence of one or more symbols, given meaning by specific acts of interpretation

  • examples: sensor value, counter, histogram

Field Data
  • data collected in an uncontrolled in situ environment

  • data collected by using a product in a realistic environment

  • examples: pre- or post-SOP data collection with production vehicles, etc.

Context Data
  • essential vehicle or infrastructure characteristics

  • data retrieved from outside the object under inspection (e.g. products)

  • examples: VIN, OEM, model, odometer, etc.

Meta data
  • descriptive, structural or administrative documentation about data ("data about data")

  • piece of information, necessary to use or properly interpret actual data

  • examples: information about data origin, quality, organisation, implementation, etc.

9. FAQs

How can I get to my data?

The easiest way to view the data is within the Data Browser or with the help of Project Specific Pages (such as dashboards, for example). Furthermore, data can be retrieved by using the Data Analyzer or by requesting the RESTful MongoDB Query Service. Please note that MongoDB queries for Bosch IoT Insights must comply with the extended JSON format.

How can I download my data?

The Data Analyzer allows the result of the executed query to be downloaded as JSON or CSV (Download button next to the Run button). Moreover, when requesting the RESTful MongoDB Query Service, the response can be processed and used further (and can therefore be stored locally as well).

How can I upload my data?

Data can be sent to the Bosch IoT Insights backend using the HTTP Data Recorder Service interface. As described, authentication has to be provided, as well as the selection of a project-specific collection.
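As a minimal sketch in the style of the Scala examples above (the exact resource path, project ID, and payload are assumptions; please consult the Swagger UI of your instance for the correct endpoint):

import java.nio.charset.StandardCharsets
import java.util.Base64
import com.sun.jersey.api.client.Client
import com.sun.jersey.api.client.ClientResponse
import com.sun.jersey.api.client.WebResource

object DataRecorderUploadExample {

  def main( args: Array[String] ): Unit = {
    // basic authentication, built like in the examples above
    val auth: String = "Basic " + new String(
      Base64.getEncoder.encode( "your_username:your_password".getBytes ), StandardCharsets.UTF_8 )

    // hypothetical endpoint: default Data Recorder Service URL plus the target project ID
    val resourceUrl: String = "https://www.bosch-sfde.com/data-recorder-service/v2/" + "your_project_id"

    // illustrative JSON payload
    val payload: String = "{\"deviceId\":\"111\",\"P1\":[5, 2, 2, 3, 2]}"

    val service: WebResource = Client.create().resource( resourceUrl )
    val response: ClientResponse = service
      .header( "Authorization", auth )
      .header( "Content-Type", "application/json" )
      .post( classOf[ClientResponse], payload )

    println( response.getStatus )
    println( response.getEntity( classOf[String] ) )
  }
}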

How can I send own aggregation queries against my collections?

When the data shall not only be browsed but rather be selected in a more complex way, custom MongoDB aggregation queries can be sent against the project-specific collections. This can be done by utilizing the Data Analyzer or by accessing the MongoDB Query Service of the backend using HTTP REST requests. Please note that MongoDB queries for Bosch IoT Insights must comply with the extended JSON format.

Why doesn’t Insights accept my MongoDB queries which work fine from within other applications?

Since all MongoDB queries from within Insights (UI and REST API) must be written in extended JSON, this chapter explains further details and gives examples of valid and invalid queries. It furthermore exemplifies how to convert queries to comply with the extended JSON format.
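A typical example is a date comparison (the field name is taken from the sample properties file above and is only illustrative): the mongo shell helper ISODate(...) is not valid extended JSON and has to be replaced with the $date operator:

Valid (extended JSON):   { "metaData.eventStartTs": { "$gt": { "$date": "2018-07-01T00:00:00.000Z" } } }
Invalid (shell syntax):  { "metaData.eventStartTs": { "$gt": ISODate("2018-07-01T00:00:00.000Z") } }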

How can I retrieve my old queries?

Within the user interface, the Query History stores and lists all previous queries of the user. Furthermore, they can be filtered by selecting a tag name. The MongoDB Query Service also offers the possibility of requesting all queries from the authenticated user.

How can I see the data metrics of my project?

Within the Data Analyzer, statistical information of the project-specific collections is displayed next to the query editor.

How can I access my data from within Matlab?

Newer Matlab versions already offer the possibility of requesting RESTful services directly. However, additional extensions are required when working with older versions. Matlab can then call the MongoDB Query Service and process the received data locally afterwards (a detailed request example can be found here).

How can I get my JSON data in a CSV and Excel-ready format?

The JSON data of the collections can either be downloaded as CSV within the Data Analyzer or it can be converted to CSV with the help of the offline Json2Csv Converter.

How do I authenticate when requesting Insights with a REST client?

There are basically two authentication options when requesting a RESTful endpoint of the Bosch IoT Insights backend: the typical HTTP Basic authentication header or an API key parameter that can be added to the header of the request (details of these methods can be found here).
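For HTTP Basic authentication, the header is built exactly like in the Scala examples above, i.e. the Base64 encoding of username:password (the API key variant uses its own header; see the linked chapter for its exact name):

Authorization: Basic <Base64 encoding of username:password>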

When using the Swagger-UI, I get a 403 status code and a message saying something about CSRF tokens.

We protect you against CSRF and therefore you need to authenticate for some API operations. See this chapter which also describes how to log in over the Swagger-UI.

My HTTP-Client is complaining that it can’t find the given host address.

Maybe you are behind a proxy server. Please see this chapter for details.

Which resources does my REST client need to request?

When data shall be ingested, the endpoint of the HTTP Data Recorder Service needs to be requested. When data shall be accessed, the REST resource of the MongoDB Query Service has to be addressed.

Why can’t I paste code snippets from the clipboard into textfields while using Internet Explorer?

Since Internet Explorer blocks access to the clipboard by default, this access has to be granted manually. No administration rights are required for this (although a warning text within the security settings indicates otherwise). The following screenshots show how to enable access to the clipboard within the security settings ([Internet Options] > [Security] > [Internet] > [Custom Level] > [Scripting] > enable [Allow Programmatic Clipboard Access]). Further information can be found here.

ie settings 01
ie settings 02