An authentication token is necessary to use the emnify REST API. Historically, you could retrieve this token in two ways: with your user credentials (username and password) or an application token.
In an effort to improve security across our services, authentication with user credentials has now been deprecated.
What is changing?
If you currently use user credentials to authenticate with the BICS REST API, you should update your workflow to authenticate with application tokens instead.
Why is BICS making this change?
Using account credentials like passwords outside BICS’s secure login screens creates security risks. Application tokens allow you to generate unique tokens for each application that can be revoked if needed. That way, if your information is compromised, you only need to update the affected token for that application rather than changing your username or password.
Which methods will be restricted?
All requests using authentication with user credentials will return an error after the deprecation period.
New event types have been added for different kinds of scenarios, such as CloudConnect, OpenVPN, SIM and quota management.
Example:
[
  {
    "timestamp": {},
    "alert": true,
    "description": "Data quota enabled.",
    "id": 69535,
    "event_type": {
      "description": "Data quota enabled",
      "id": 52
    },
    "event_source": {
      "name": "API",
      "id": 2
    },
    "event_severity": {
      "description": "INFO",
      "id": 0
    },
    "organisation": {
      "name": "Organisation_Name",
      "id": 2
    },
    "endpoint": {
      "name": "Monitoring201",
      "tags": "Monitoring",
      "ip_address": "10.199.6.39",
      "imei": null,
      "id": 1
    },
    "sim": {
      "iccid": "8988317000000000000",
      "production_date": {},
      "id": 110
    },
    "imsi": {
      "imsi": "901430000000114",
      "import_date": {},
      "id": 110
    }
  }
]
Datadog integration has been added to the new data streamer system. Now you can specify and change your Datadog region (US, US3, EU, US1FED).
Example:
{
  "data_stream_type": {
    "id": 2
  },
  "destination": {
    "connection_type": "Datadog",
    "credentials": {
      "api_key": "0123456789abcdef0123456789abcdef01234567",
      "region": "US"
    },
    "format": "Json"
  }
}
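As a sketch of how this request body might be built programmatically (field names are taken from the example above; the region whitelist reflects the regions listed in this changelog and is an assumption that may grow over time):

```python
# Sketch: build the request body for creating a Datadog data stream.
# Field names mirror the example above; the region set is an assumption
# based on the regions named in this changelog entry.
SUPPORTED_DATADOG_REGIONS = {"US", "US3", "EU", "US1FED"}

def build_datadog_stream_config(api_key: str, region: str = "US") -> dict:
    if region not in SUPPORTED_DATADOG_REGIONS:
        raise ValueError(f"unsupported Datadog region: {region}")
    return {
        "data_stream_type": {"id": 2},
        "destination": {
            "connection_type": "Datadog",
            "credentials": {"api_key": api_key, "region": region},
            "format": "Json",
        },
    }
```

The resulting dictionary can then be serialised and sent to the data stream creation entrypoint.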
Data streamer configuration - descriptive name field
Data streamer now supports an optional descriptive name to help distinguish similar data stream configurations. The name can be set when creating or updating the configuration.
Example:
{
  "data_stream_type": {
    "id": 1
  },
  "name": "Production Kinesis stream",
  "destination": {
    "connection_type": "AwsKinesis",
    "credentials": {
      "region": "af-south-1",
      "stream_name": "demo-stream",
      "role_arn": "arn:aws:iam::1234567890:role/role_for_kinesis_data_stream"
    },
    "format": "Json"
  }
}
An improved version of the Data Streamer is available. The update includes format and behaviour changes that improve performance, stability and usability. To ensure a secure and seamless transition, streams running on the old system are migrated to the new system automatically. Migrated streams match the previous format, so your implementations will continue to work as expected. However, new features are only available on the new system with the latest format, so we strongly recommend upgrading to the latest version by creating a new stream.
You can check whether you are using a deprecated or migrated stream in the portal; the state is indicated within the connection type name:
- “Deprecated” - Your stream is provided by the old system and will be migrated soon.
- “Deprecated Format” - Your stream has been migrated to the new system and you are using the deprecated format.
Streams on the new system that provide the latest features are considered the default state and are therefore not marked in any way.
Below is an overview of the differences between deprecated streams of the old system (“Deprecated”), streams migrated to the new system (“Deprecated Format”), and the latest version.
Note: Most differences are shared between all destinations; destination-specific differences will be stated explicitly.
General Message Format Changes
This section is relevant for most destinations and for Event and Usage Data.
Timestamp Format
The format of timestamps (e.g. for event->timestamp, sim->production_date, imsi->import_date) has changed:
- Old streamer & migrated streamer (human-readable with a space separator and no time zone indication; UTC is implied):
"timestamp": "2020-11-10 08:59:19"
- New streamer (ISO 8601 format using UTC timezone):
"timestamp": "2020-11-10T08:59:19Z"
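A consumer that must handle streams in both states can accept either format; a minimal sketch in Python (the function name is illustrative):

```python
from datetime import datetime, timezone

def parse_stream_timestamp(value: str) -> datetime:
    """Parse a Data Streamer timestamp in either the legacy format
    ("2020-11-10 08:59:19", implicitly UTC) or ISO 8601 ("2020-11-10T08:59:19Z")."""
    for fmt in ("%Y-%m-%dT%H:%M:%S%z", "%Y-%m-%d %H:%M:%S"):
        try:
            parsed = datetime.strptime(value, fmt)
        except ValueError:
            continue
        # The legacy format carries no zone indication but is documented as UTC.
        return parsed if parsed.tzinfo else parsed.replace(tzinfo=timezone.utc)
    raise ValueError(f"unrecognised timestamp: {value!r}")
```

Note that parsing the trailing `Z` with `%z` requires Python 3.7 or later.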
Boolean Format
The handling of boolean values (e.g. the alert field of Event Data) has changed:
- Old streamer & migrated streamer (integer 0/1):
"alert": 0
- New streamer (boolean true/false):
"alert": false
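Since both representations (0/1 and false/true) collapse to the same truth values, a consumer can normalise the field with a one-line helper; a minimal sketch (the function name is illustrative):

```python
def alert_flag(event: dict) -> bool:
    # Old/migrated streams deliver 0/1, the new stream delivers false/true;
    # bool() maps both representations to a Python boolean.
    return bool(event.get("alert", False))
```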
Null Values
Some values may be null in Event Data records. This is how they are handled by the different Data Streamer versions.
- Old streamer does not provide null root entities.
"endpoint": { "imei": "1234567890123456", "tags": null, ... }, "description": "CloudConnect Transit Gateway breakout 132 is available",
- Migrated streamer & new streamer also provide null root entities:
"endpoint": { "imei": "1234567890123456", "tags": null, ... }, "description": "CloudConnect Transit Gateway breakout 132 is available", "sim": null, "user": null,
Field Name in event_source
The name field of event_source has been renamed to description for consistency in the new system, but the old schema is kept for migrated streams.
- Old streamer & migrated streamer:
"event_source": { "id": 0, "name": "Network" }
- New streamer:
"event_source": { "id": 0, "description": "Network" }
Field Name in traffic_type
The name field of traffic_type has been renamed to description for consistency in the new system, but the old schema is kept for migrated streams.
- Old streamer & migrated streamer:
"traffic_type": { "id": 5, "name": "Data" }
- New streamer:
"traffic_type": { "id": 5, "description": "Data" }
Character Encoding
The old system uses ASCII encoding, so characters such as the currency symbol are not shown correctly. New and migrated streams provide UTF-8 encoded data.
- Old streamer:
"currency": { "id": 1, "symbol": " ", "code": "EUR" }
- Migrated streamer & new streamer:
"currency": { "id": 1, "symbol": "€", "code": "EUR" }
Amazon Kinesis-Specific Changes
This section lists differences that are specific to Kinesis streams, apart from the already discussed changes in message format.
Error Behaviour
- Old streamer
- does not indicate any errors if something is misconfigured
- stops working if a network error occurs
- Migrated streamer & new streamer
- has a retry mechanism based on the Kinesis Producer Library (KPL)
- retries transmitting data for up to 60 seconds, after which the stream will fail
- fails fast if the first message cannot be delivered at all
- fails after some retries if the destination stream becomes unavailable after previously successful transmissions (e.g. destination deleted or unreachable)
- if an error occurs, the stream is switched to the error state and transmitting stops (in that case, fix the configuration or destination and restart the stream)
Webhook / RestAPI-Specific Changes
This section lists changes specific to RestAPI, which has been renamed to Webhook in the new system.
Headers
Migrated streamer & new streamer set a different User-Agent and two additional headers.
- Old streamer
user-agent: TLD
- Migrated streamer & new streamer
user-agent: DataStreamer/2.0
accept-encoding: gzip, deflate
accept: application/json
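A receiving application can use the User-Agent header to tell whether a request came from the old streamer or from the migrated/new system; a minimal sketch (the case-insensitive header lookup is a receiver-side assumption):

```python
def is_new_streamer_request(headers: dict) -> bool:
    # The old streamer sends "TLD"; migrated and new streamers send
    # "DataStreamer/<version>" as documented above.
    user_agent = {k.lower(): v for k, v in headers.items()}.get("user-agent", "")
    return user_agent.startswith("DataStreamer/")
```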
Error Behaviour
- Old streamer fails on an undelivered message and indicates the HTTP Result Code. It does not respect redirects (HTTP Result Code 3xx) and retries constantly.
- Migrated streamer & new streamer provide an improved error behaviour:
- Has retry policy with exponential backoff
- Fails fast if first message cannot be delivered successfully
- Failed data stream is put in error state and shows HTTP Result Code description or error message. In that case, please fix configuration or destination and restart the stream.
KeenIO-Specific Changes
This section lists specific changes to KeenIO data stream.
Setup of Data Stream
The old streamer delivers to fixed destination streams. The new streamer introduces the ability to define KeenIO collection names.
The migrated streamer continues delivering data to the collection names previously defined for the old streamer.
Event Message Format
- Old streamer & migrated streamer use a reduced event schema in comparison with RestAPI (e.g. they do not deliver “imsi”). The migrated streamer additionally contains null root values.
"id": 1234,
"alert": 0,
"description": "PDP Context deleted.",
"detail": { ... },
"endpoint": { "id": 1234, "name": "Test" },
"event_severity": { ... },
"event_source": { "id": 0, "name": "Network" },
"event_type": { ... },
"organisation": { ... },
"sim": { "id": 1234, "iccid": "8988303000000123456" },
"timestamp": "2020-11-10 08:59:19"
- New streamer delivers the same schema as Webhook (therefore e.g. imsi, sim and endpoint have more information).
"id": 1234,
"alert": false,
"description": "PDP Context deleted.",
"detail": { ... },
"endpoint": {
  "id": 1234,
  "imei": "1234", // additional field
  "ip_address": "10.10.10.10", // additional field
  "name": "Test",
  "tags": null // additional field
},
"event_severity": { ... },
"event_source": { "id": 0, "description": "Network" },
"event_type": { ... },
"imsi": { // additional field
  "id": 1234,
  "import_date": "2017-05-03T20:36:28Z",
  "imsi": "123456789012345"
},
"organisation": { ... },
"sim": {
  "iccid": "8988303000000123456",
  "id": 1234,
  "msisdn": "123456789012345", // additional field
  "production_date": "2017-05-03T20:36:28Z" // additional field
},
"timestamp": "2020-11-10T08:59:19Z",
"user": null
Message Format Usage Data
- Old streamer & migrated streamer deliver a reduced schema in comparison with RestAPI. The migrated streamer additionally contains null root values.
"cost": 0.0015,
"id": 1618839102882,
"operator": { ... },
"organisation": { ... },
"tariff": { ... },
"traffic_type": { "id": 5, "name": "Data" },
"endpoint": {
  "id": 1234,
  "balance": { // old streamer: balance not present when not set
    "amount": 15.0,
    "last_updated": "2020-05-29 15:38:36",
    "expiry_date": "2025-04-06 09:00:00", // old streamer: expiry_date not present when not set
    "currency": { ... }
  }
},
"volume": { ... },
"start_timestamp": "2021-04-19 13:31:42",
"keen": { // old streamer: additional keen.timestamp that is the same as start_timestamp
  "timestamp": "2021-04-19 13:31:42" // migrated/new streamer: not present, but automatically set by KeenIO upon receipt
},
"sim": { "id": 1234, "iccid": "8988303000000123456" },
"currency": { ... },
"end_timestamp": "2021-04-19 13:31:42"
- New Data Streamer delivers the same schema as Webhook:
"cost": 0.14776475,
"currency": { ... },
"end_timestamp": "2021-02-22T13:05:45Z",
"endpoint": {
  "id": 1234,
  "imei": "1234567890123456", // additional field
  "ip_address": "10.10.10.10", // additional field
  "name": "Test",
  "balance": {
    "amount": 15.0,
    "last_updated": "2020-05-29T15:38:36Z",
    "expiry_date": "2025-04-06T09:00:00Z",
    "currency": { ... }
  }
},
"id": 1234,
"imsi": "123456789012345",
"imsi_id": 1234,
"operator": { ... },
"organisation": { ... },
"sim": {
  "iccid": "8988303000000123456",
  "id": 1234,
  "msisdn": "123456789012345", // additional field
  "production_date": "2017-05-03T20:36:28Z" // additional field
},
"start_timestamp": "2021-02-22T13:04:41Z",
"tariff": { ... },
"traffic_type": { "id": 5, "description": "Data" },
"volume": { ... }
Error Behaviour
- Old streamer always shows Status 200 even if the data is not delivered successfully, e.g. because of a wrong/invalid key. Status 500 is only shown if there is no response/connection from the KeenIO server.
- Migrated streamer & new streamer apply the same retry strategy as Webhook and display the returned error code.
Amazon S3-Specific Changes
Message Format Event Data
- Old & migrated streamer CSV header:
"id","timestamp","event_type_id","event_type_description","event_severity_id","event_severity_description","organisation_id","organisation_name","description","alert","event_source_id","event_source_name","endpoint_id","endpoint_name","endpoint_imei","endpoint_ip_address","endpoint_tags","sim_id","sim_iccid","msisdn_msisdn","sim_production_date","imsi_id","imsi_imsi","user_id","user_username","user_name"
- New streamer CSV header (includes more fields):
"id","timestamp","event_source_id","event_source_description","event_severity_id","event_severity_description","event_type_id","event_type_description","organisation_id","organisation_name","user_id","user_username","user_name","alert","description","endpoint_id","endpoint_name","endpoint_ip_address","endpoint_tags","endpoint_imei","sim_id","sim_iccid","sim_msisdn","sim_production_date","imsi_id","imsi_imsi","imsi_import_date","detail"
Message Format Usage Data
- Old and migrated streamer CSV header:
"id","event_start_timestamp","event_stop_timestamp","organisation_id","organisation_name","endpoint_id","sim_id","iccid","imsi","operator_id","operator_name","country_id","operator_country_name","traffic_type_id","traffic_type_description","volume","volume_tx","volume_rx","cost","currency_id","currency_code","currency_symbol","ratezone_tariff_id","ratezone_tariff_name","ratezone_id","ratezone_name","endpoint_name","endpoint_ip_address","endpoint_tags","endpoint_imei","msisdn_msisdn","sim_production_date","operator_mncs","country_mcc"
- New streamer CSV header (includes more fields):
"id","start_timestamp","end_timestamp","cost","volume_total","volume_rx","volume_tx","operator_id","operator_name","operator_mnc","operator_country_id","operator_country_mcc","operator_country_name","organisation_id","organisation_name","tariff_id","tariff_name","tariff_ratezone_id","tariff_ratezone_name","traffic_type_id","traffic_type_description","endpoint_id","endpoint_name","endpoint_ip_address","endpoint_tags","endpoint_imei","endpoint_balance_amount","endpoint_balance_last_updated","endpoint_balance_expiry_date","endpoint_balance_currency_id","endpoint_balance_currency_code","endpoint_balance_currency_symbol","imsi","imsi_id","sim_id","sim_iccid","sim_msisdn","sim_production_date","currency_id","currency_code","currency_symbol"
Documentation Improvements:
Documentation on Filtering and Sorting at /event endpoints has been updated, and notes on new behaviour due to a new software stack have been added.
Note: This change only applies to your instance if the new software stack is in place, otherwise there will be no change in documentation and behaviour.
Topic | Method | Entrypoint | Description |
---|---|---|---|
Endpoint | GET | /api/v1/endpoint/{endpoint_id}/event | Added detailed information on filtering, sorting and new behaviour |
SIMs | GET | /api/v1/sim/{sim_id}/event | Added detailed information on filtering, sorting and new behaviour |
User Management | GET | /api/v1/user/{user_id}/event | Added detailed information on filtering, sorting and new behaviour |
Events | GET | /api/v1/event | Added detailed information on filtering, sorting and new behaviour |
Organisation | GET | /api/v1/organisation/{org_id}/event | Added detailed information on filtering, sorting and new behaviour |
Detailed information on filtering on these endpoints has been added to Collections & Pagination as well.
Features:
The Swagger Reference now persists authentication state across page reloads and closing/reopening the browser. Users who sign in via the authenticate API and store an auth token in the authorize field will remain logged in for 240 minutes.
Fixes:
The Create Endpoint API has been fixed with regard to the status property, which determines whether the endpoint should be active on creation. The examples provided with this API have also been improved: they now show a basic example with mandatory fields only and an additional example with optional fields such as IMEI lock.
Topic | Method | Entrypoint | Description |
---|---|---|---|
Endpoints | POST | /api/v1/endpoint | Improved examples and fixed endpoint status properties |
Additions:
Users can now specify DNS configuration via API. This functionality allows users to specify primary and secondary nameservers for the DNS resolution of endpoint networking. DNS changes are applied instantly to any new PDP context; already connected devices with established PDP contexts will continue to use the previous nameserver configuration until the next time they reconnect.
Topic | Method | Entrypoint | Description |
---|---|---|---|
Service Lookups and Configuration | GET | /api/v1/dns | Returns a list of DNS configuration objects |
Service Lookups and Configuration | POST | /api/v1/dns | Create a new DNS configuration object |
Service Lookups and Configuration | DELETE | /api/v1/dns/{dns_id} | Delete a DNS configuration by ID |
Documentation Improvements:
DNS configs are added to Service Profiles and all endpoints which use this Service Profile will use the specified DNS settings.
Topic | Method | Entrypoint | Description |
---|---|---|---|
Service Profiles | POST | /api/v1/service_profile | Added examples for creating Service Profiles with DNS config |
Service Profiles | PATCH | /api/v1/service_profile/{service_profile_id} | Added examples for updating and removing DNS configs of Service Profiles |
This release introduces the documentation for the following entrypoint:
Topic | Method | Entrypoint | Description |
---|---|---|---|
Tariff Plans | POST | /organisation/{organisation_id}/tariff_plan | Assign Tariff Plan to Organisation |
Inclusive volume is counted per device on a monthly basis; the data can be pooled or be treated per individual device.
Example: 1 MB inclusive volume at €1 for an organisation with 5 active SIMs will result in 5 MB inclusive volume at a cost of €5 for the billing month.
- If the inclusive volume is pooled, all 5 SIMs can consume from a pool of 5 MB. If one SIM consumes 4 MB and another consumes 1 MB, the inclusive volume has been depleted and further data consumption on any SIM will be billed according to excess traffic rates.
- For non-pooled inclusive volume, all five SIMs will have 1 MB inclusive volume each. Consuming more than 1 MB per device will be billed according to excess traffic rates.
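The pooled vs. per-device accounting above can be sketched as a small calculation (a simplification that ignores rounding and proration; the function name is illustrative):

```python
def excess_mb(per_sim_consumption_mb, inclusive_per_sim_mb, pooled):
    """Return the MB billed at excess traffic rates for one billing month."""
    if pooled:
        # All SIMs draw from one shared pool (inclusive volume x SIM count).
        pool = inclusive_per_sim_mb * len(per_sim_consumption_mb)
        return max(0.0, sum(per_sim_consumption_mb) - pool)
    # Each SIM has its own allowance; unused allowance is not shared.
    return sum(max(0.0, used - inclusive_per_sim_mb)
               for used in per_sim_consumption_mb)
```

With the example above (5 SIMs, 1 MB each): pooled, a SIM using 4 MB and another using 1 MB exactly depletes the 5 MB pool; non-pooled, the SIM that used 4 MB is billed 3 MB of excess traffic.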
Topic | Method | Entrypoint | Description |
---|---|---|---|
Organisation | GET | /organisation/{org_id_or_my}/inclusive_volume/active | Returns a list of inclusive volumes that are currently active in the current billing period for the selected organisation |
AWS Trust Relationship
For AWS integrations, authentication is performed by means of a role on the AWS side with write permissions to the desired resources. A trust relationship is added for the BICS datastream which allows for securely delivering payload data.
POST /api/v1/data_stream:
S3 integration request body:
{
  "stream_historic_data": 0,
  "data_stream_type": {
    "description": "Usage Data & Events",
    "id": 3
  },
  "api_type": {
    "description": "AWS S3",
    "id": 8
  },
  "api_username": "arn:aws:iam::<your-account_id>:role/<bucket-role-name>",
  "api_parameter": "eu-west-1/bucket-name"
}
Kinesis integration request body:
{
  "stream_historic_data": 0,
  "data_stream_type": {
    "description": "Usage Data & Events",
    "id": 3
  },
  "api_type": {
    "description": "AWS Kinesis",
    "id": 4
  },
  "api_username": "arn:aws:iam::<your-account_id>:role/<kinesis-role-name>",
  "api_parameter": "eu-west-1/<kinesis-stream-name>"
}
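The two request bodies differ only in the api_type and the role/target names, so they can be produced by one small helper; a sketch (the helper name is illustrative, the ids 8 for S3 and 4 for Kinesis are taken from the examples above):

```python
# Sketch: build the request body for POST /api/v1/data_stream with an
# AWS trust relationship, following the S3 and Kinesis examples above.
API_TYPES = {"AWS S3": 8, "AWS Kinesis": 4}

def build_aws_stream_body(service, account_id, role_name, region, target):
    return {
        "stream_historic_data": 0,
        "data_stream_type": {"description": "Usage Data & Events", "id": 3},
        "api_type": {"description": service, "id": API_TYPES[service]},
        "api_username": f"arn:aws:iam::{account_id}:role/{role_name}",
        "api_parameter": f"{region}/{target}",
    }
```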
For details on the setup steps on the AWS side, refer to the Datastreamer AWS Integration page.
Topic | Method | Entrypoint | Description |
---|---|---|---|
Integration | POST | /data_stream | AWS Trust Relationship Changes |
Endpoint Quota Changes
The following endpoints for managing endpoint quota have been updated according to the current validation. Validation of the action_on_exhaustion property is now less restrictive with regard to the contained element: the system will not generate errors if throttling ("id": 2) is selected for SMS quotas. For example, the following call is allowed but will not have any effect on SMS usage when the quota is exhausted:
POST /api/v1/endpoint/{endpoint_id}/quota/sms:
Request body:
{
  ...
  "action_on_exhaustion": {
    "id": 2
  },
  ...
}
Topic | Method | Entrypoint | Description |
---|---|---|---|
Endpoint | POST | /endpoint/{endpoint_id}/quota/data | Set Data Quota |
Endpoint | GET | /endpoint/{endpoint_id}/quota/data | Retrieve Data Quota details |
Endpoint | POST | /endpoint/{endpoint_id}/quota/sms | Set SMS Quota |
Endpoint | GET | /endpoint/{endpoint_id}/quota/sms | Show SMS Quota details |
Daily Statistics Fixes
Version 1.2.21 of the specification fixes the query parameters of the daily statistics entrypoints:
Topic | Method | Entrypoint | Description |
---|---|---|---|
Endpoint | GET | /endpoint/{endpoint_id}/stats/daily | Endpoint Usage and Costs Statistics per day |
SIMs | GET | /sim/{sim_id}/stats/daily | SIM Usage and Costs Statistics per day |
Organisation | GET | /organisation/{org_id}/stats/daily | Organisation Usage and Costs Statistics per day |
With this update, the data streamer now delivers more comprehensive Usage and Event data to the destination buckets, Kinesis streams, and user-provided applications.
Additional S3 Data (CSV)
Note: Users of the S3 integration should ensure that minimum security standards are enforced. Details of configuring buckets for this purpose are described in Security Guidelines.
Event Data
The following additional properties are now included in event stream CSV files as column headers (where applicable):
endpoint_id
endpoint_name
endpoint_imei
endpoint_ip_address
endpoint_tags
event_source_id
event_source_name
sim_id
sim_iccid
msisdn_msisdn
sim_production_date
imsi_id
imsi_imsi
user_id
user_username
user_name
Usage Data
The following additional properties are now included in usage stream CSV files as column headers (where applicable):
endpoint_name
endpoint_ip_address
endpoint_tags
endpoint_imei
msisdn_msisdn
sim_production_date
operator_mncs
country_mcc
Additional Kinesis Data (JSON)
Kinesis Event Data
The following additional properties are now included in event stream JSON for the endpoint, sim and imsi objects:
{
  "endpoint": {
    "name": "...",
    "imei": "...",
    "ip_address": "...",
    "tags": "..."
  },
  "sim": {
    "msisdn": "...",
    "production_date": "..."
  },
  "imsi": {
    "imsi": "...",
    "import_date": "..."
  }
}
Kinesis Usage Data
The following additional properties are now included in usage stream JSON for the endpoint, sim and operator objects:
{
  "endpoint": {
    "name": "...",
    "ip_address": "...",
    "tags": "...",
    "imei": "..."
  },
  "sim": {
    "msisdn": "...",
    "production_date": "..."
  },
  "operator": {
    "mnc": "...",
    "country": {
      "mcc": "..."
    }
  }
}
Additional Rest API Data (JSON)
The data streamer may also send usage and event data in JSON format to a configurable URL specified by the user. In this case, users provide an application which consumes HTTP POST requests sent from the BICS platform. With this deployment, both Rest API stream types (Rest and Bulk) will include the additional data as specified in Kinesis Event Data and Kinesis Usage Data.
Security Guidelines
Event data that is sent via Data Streams may include usernames, email addresses and other data which can identify users or platform resources. The generated .csv files should therefore be treated as containing sensitive information. Precautions should be taken to ensure that the event and usage data in the destination S3 buckets are adequately secured.
The following three steps should be considered as the minimum security requirements for storing such data in S3:
- Ensure that the S3 bucket is not publicly accessible. This can be applied in the Permissions tab of the S3 bucket.
- Server-Side Encryption can be enabled per-bucket and S3 will encrypt objects before they are saved to disk. The decryption is then performed when downloading the objects. This can be enabled in the Properties tab of the S3 bucket.
- The IAM user whose credentials (key ID & key secret) access the S3 bucket should have their permissions restricted to writing to the required bucket only. This can be done in the AWS console in Users -> {Data Stream User} -> Permissions -> Add Permissions -> Attach Policy Directly -> Create Policy. An example JSON Policy would look like:
{
  "Id": "Policy1569854716005",
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1569854710254",
      "Action": [
        "s3:PutObject"
      ],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::my-data-stream/*" // <1>
    }
  ]
}
<1> ARN of the objects in the destination S3 bucket (the /* suffix is required because s3:PutObject acts on objects, not on the bucket itself). An identity-based policy attached directly to the IAM user contains no Principal element; the user it is attached to is the implicit principal.
TIP
AWS provides an online JSON Policy Generator which can be used to create a policy like the example given above.
Backwards Compatibility
The BICS Data Streamer is under active development and is updated for performance and quality improvements regularly so that users of the platform may gain rich streaming insights into usage and event data.
Versioning
There is no external versioning of the Data Streamer that developers need to track. Updates are therefore always performed on the service with the intent of preserving backwards compatibility. This means that existing JSON or CSV entities keep their ordering and are not renamed or replaced when updates to the data streamer are performed.
Parsing S3 or Kinesis Artifacts
Users who have built custom integrations (in AWS or elsewhere) that consume the JSON or CSV generated from S3 or Kinesis streams should expect that additional JSON or CSV data may be added in the future. Mature, tested libraries designed for parsing or reading CSV and JSON data should be used for custom integrations (which may run in Lambdas, for example) to ensure compatibility when additional properties or objects are added to data streams.
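One way to follow this advice for the CSV artifacts is to select columns by header name rather than by position, so columns appended in future updates are simply ignored; a minimal sketch (the function name is illustrative):

```python
import csv
import io

def select_columns(csv_text: str, wanted: list) -> list:
    """Read only the named columns from a streamed CSV file;
    columns added to the stream later are ignored."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [{name: row.get(name) for name in wanted} for row in reader]
```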
Endpoint Quota Changes
The following endpoints for managing endpoint quota have been improved.
Topic | Method | Entrypoint | Description |
---|---|---|---|
Endpoint | POST | /endpoint/{endpoint_id}/quota/data | Set Data Quota |
Endpoint | GET | /endpoint/{endpoint_id}/quota/data | Retrieve Data Quota details |
Endpoint | POST | /endpoint/{endpoint_id}/quota/sms | Set SMS Quota |
Endpoint | GET | /endpoint/{endpoint_id}/quota/sms | Show SMS Quota details |
Traffic Limit Changes
Method | Entrypoint | Status |
---|---|---|
POST | /traffic_limit | Removed |
Traffic Limit POST Details
The POST method for /traffic_limit has been marked deprecated and is now removed from the documentation.
Traffic Limits may still be applied by adding existing traffic limits to a service profile with the following operation:
PUT /api/v1/service_profile/{profile}/service/{service}/traffic_limit/{traffic_limit}
For more information on adding traffic limits to services, see the Swagger Reference for this entrypoint.
API Rate Limiting Details
The supplementary documentation adds details on newly-applied logic for API rate limiting.
The API has rate limiting enforced at 2000 requests per IP in a five-minute window.
When this threshold is reached, 403 Forbidden is returned by the system in response to further requests until the average request rate falls within the specified limit.
This is documented and may be easily referenced in the Rate Limits page.
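A client can treat 403 responses as a signal to back off and retry; a sketch (the send callable and the delay schedule are illustrative, not part of the API):

```python
import time

def call_with_backoff(send, max_attempts=5, base_delay=1.0, sleep=time.sleep):
    """Retry `send` (a zero-argument callable returning a response object
    with a status_code attribute) while the rate limiter answers 403."""
    response = send()
    for attempt in range(1, max_attempts):
        if response.status_code != 403:
            return response
        sleep(base_delay * 2 ** (attempt - 1))  # exponential backoff
        response = send()
    return response
```

In practice `send` would wrap something like a `requests.get` call against the API; injecting `sleep` keeps the helper testable.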
RAT Type Changes
A new parameter has been added to the following RAT Type endpoint:
Topic | Method | Entrypoint | Description |
---|---|---|---|
RAT Type | GET | /rat_type | Returns a list of supported RAT types |
QoS Definitions
A new API for defining QoS bandwidth limits per PDP context definition and RAT type has been added:
Topic | Method | Entrypoint | Description |
---|---|---|---|
PDP Context Definitions | POST | /pdp_context_definition/{pdp_id}/qos_definition | Add a new QoS Limit |
PDP Context Definitions | GET | /pdp_context_definition/{pdp_id}/qos_definition | Get the limits for the given PDP context definition and RAT Type |
PDP Context Definitions | PATCH | /pdp_context_definition/{pdp_id}/qos_definition | Modifies the limits for the given PDP Context and RAT Type |
PDP Context Definitions | DELETE | /pdp_context_definition/{pdp_id}/qos_definition | Remove the limits |
Furthermore, the GET call for PDP Context Definitions has been modified to include the QoS definitions as well:
Topic | Method | Entrypoint | Description |
---|---|---|---|
PDP Context Definitions | GET | /pdp_context_definition | Get the list of PDP Context Definitions |
SMS Routing Changes
The following endpoints for configuration of SMS Routing entries have been updated:
Topic | Method | Entrypoint | Description |
---|---|---|---|
SMS Routing | GET | /sms_routing | Get list of SMS routing entries |
SMS Routing | GET | /sms_routing/{sms_routing_id}/data | Get list of SMS routing data |
SMS Routing | POST | /sms_routing/{sms_routing_id}/data | Create SMS routing data |
SMS Routing | PATCH | /sms_routing/{sms_routing_id}/data/{sms_routing_data_id} | Update SMS routing data |
Data Streamer
The following entrypoint to delete Data Streams has been added:
Topic | Method | Entrypoint | Description |
---|---|---|---|
Data Streamer | DELETE | /data_stream/{data_stream_id} | Delete a Data Stream by ID |
Data Streamer (1.2.14)
The following entrypoint to list and manage Data Streams has been added:
Topic | Method | Entrypoint | Description |
---|---|---|---|
Data Streamer | GET | /data_stream | List Data Streams |
Data Streamer | POST | /data_stream | Create Data Stream |
Additional documentation pages have been added for the following Data Streamer integrations:
RAT Type Additions
The following endpoint to retrieve a list of RAT types has been added:
Topic | Method | Entrypoint | Description |
---|---|---|---|
RAT Type | GET | /rat_type | List RAT types |
Network Coverage Data RAT Type Blacklist Additions
The following endpoints have been added for blacklisting RAT Types on network coverage data:
Topic | Method | Entrypoint | Description |
---|---|---|---|
Network Coverage | PUT | /network_coverage/{coverage_id}/coverage/{operator_id}/rat_type/{rat_type_id} | Blacklist RAT type |
Network Coverage | DELETE | /network_coverage/{coverage_id}/coverage/{operator_id}/rat_type/{rat_type_id} | Remove a RAT type from the blacklist |
Network Coverage Data Additions
The following endpoint has been updated to show a list of blacklisted RAT types on a network coverage data entry:
Topic | Method | Entrypoint | Description |
---|---|---|---|
Network Coverage | GET | /network_coverage/{coverage_id}/coverage | Retrieve Network Coverage Data |
SMS Routing Additions
The following endpoints for configuration of SMS Routing entries have been added:
Topic | Method | Entrypoint | Description |
---|---|---|---|
SMS Routing | GET | /sms_routing | Get list of SMS routing entries |
SMS Routing | POST | /sms_routing | Create a new SMS routing entry |
SMS Routing | GET | /sms_routing/{sms_routing_id} | Fetch SMS routing entry by ID |
SMS Routing | PATCH | /sms_routing/{sms_routing_id} | Update an SMS routing entry |
SMS Routing | DELETE | /sms_routing/{sms_routing_id} | Delete an SMS routing entry |
SMS Routing | GET | /sms_routing/{sms_routing_id}/data | Get list of SMS routing data |
SMS Routing | POST | /sms_routing/{sms_routing_id}/data | Create SMS routing data |
SMS Routing | PATCH | /sms_routing/{sms_routing_id}/data/{sms_routing_data_id} | Update SMS routing data |
SMS Routing | DELETE | /sms_routing/{sms_routing_id}/data/{sms_routing_data_id} | Delete SMS routing data |
PDP Context Definitions Additions
The following endpoints for configuration of PDP Context Definitions have been improved to explain the request parameters better:
Topic | Method | Entrypoint | Description |
---|---|---|---|
PDP Context Definitions | POST | /pdp_context_definition | Create PDP Context Definition |
PDP Context Definitions | PATCH | /pdp_context_definition/{definition_id} | Update a PDP Context Definition |
Updates
The following entrypoints for endpoint and organisation prepaid balance have been updated:
Topic | Method | Entrypoint | Description |
---|---|---|---|
Endpoint | GET | /endpoint/{endpoint_id}/balance | Updated Endpoint balance description |
Prepaid Balance | GET | /organisation/{org_id}/balance | Updated prepaid balance of given organisation |
Additions
The following endpoints for retrieval of statistics have been added:
Topic | Method | Entrypoint | Description |
---|---|---|---|
Statistics | GET | /endpoint/{endpoint_id}/stats/daily | Endpoint Usage and Costs Statistics per day |
Statistics | GET | /stats/daily | Organisation Usage and Costs Statistics per day for the current month |
Statistics | GET | /organisation/{org_id}/stats/daily | Organisation Usage and Costs Statistics per day |
Statistics | GET | /sim/{id}/stats | SIM Usage and Costs Statistics |
Statistics | GET | /sim/{id}/stats/daily | SIM Usage and Costs Statistics per day |
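The daily statistics entrypoints above follow a common pattern, differing only in the resource scope. A small helper can make that pattern explicit; the base URL below is a placeholder:

```python
# Placeholder base URL; substitute your actual API host.
API_BASE = "https://api.example.com/v1"

def daily_stats_url(scope: str, resource_id=None) -> str:
    """Return the per-day Usage and Costs Statistics URL for a scope.

    Scopes mirror the table above: 'endpoint', 'organisation', 'sim',
    and 'self' (current organisation, current month; no ID required).
    """
    paths = {
        "endpoint": f"/endpoint/{resource_id}/stats/daily",
        "organisation": f"/organisation/{resource_id}/stats/daily",
        "sim": f"/sim/{resource_id}/stats/daily",
        "self": "/stats/daily",
    }
    return API_BASE + paths[scope]

print(daily_stats_url("sim", 42))
```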
In 1.2.9, the following MNO-only operations have been added:
Network Coverage Additions
Topic | Method | Entrypoint | Description |
---|---|---|---|
Network Coverage | GET | /network_coverage | Retrieve Network Coverage Collection |
Network Coverage | GET | /network_coverage/{coverage_id}/coverage | Retrieve Network Coverage Data |
Network Coverage | PUT | /network_coverage/{coverage_id}/coverage/{operator_id} | Add Operator to Network Coverage |
Network Coverage | PATCH | /network_coverage/{coverage_id}/coverage/{operator_id} | Update VPLMN Status in Network Coverage Configuration |
Operator Additions
Topic | Method | Entrypoint | Description |
---|---|---|---|
Operator | GET | /operator | List Operators |
Operator | POST | /operator | Add an Operator |
Operator | PATCH | /operator/{operator_id} | Update an Operator |
Operator | DELETE | /operator/{operator_id} | Delete an Operator |
Operator | GET | /operator/{operator_id}/{operator_data} | Retrieve Related Data of an Operator |
Operator | POST | /operator/{operator_id}/{operator_data} | Update Related Data of an Operator |
Operator | DELETE | /operator/{operator_id}/{operator_data}/{op_data_id} | Delete Related Data of an Operator |
PDP Context Definitions
Topic | Method | Entrypoint | Description |
---|---|---|---|
PDP Context Definitions | GET | /pdp_context_definition | Retrieve Collection of PDP context definitions |
PDP Context Definitions | POST | /pdp_context_definition | Create PDP Context Definition |
PDP Context Definitions | PATCH | /pdp_context_definition/{definition_id} | Update a PDP Context definition |
PDP Context Definitions | DELETE | /pdp_context_definition/{definition_id} | Delete a PDP Context definition |
Tariffs | PUT | /tariff/{tariff_id}/pdp_context_definition/{definition_id} | Assign PDP Context Definition to Tariff |
Tariffs | DELETE | /tariff/{tariff_id}/pdp_context_definition/{definition_id} | Unassign PDP Context Definition from Tariff
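Assigning and unassigning a PDP Context Definition use the same path with different HTTP methods, as the last two rows show. A minimal sketch (placeholder base URL):

```python
# Placeholder base URL; substitute your actual API host.
API_BASE = "https://api.example.com/v1"

def pdp_assignment(tariff_id: int, definition_id: int, assign: bool):
    """Return (method, url) for assigning (PUT) or unassigning (DELETE)
    a PDP Context Definition to/from a tariff."""
    method = "PUT" if assign else "DELETE"
    url = f"{API_BASE}/tariff/{tariff_id}/pdp_context_definition/{definition_id}"
    return method, url

print(pdp_assignment(1, 7, assign=True))
```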
Inter Operator Tariff
Topic | Method | Entrypoint | Description |
---|---|---|---|
Inter Operator Tariff | GET | /iot | List Inter Operator Tariffs |
Inter Operator Tariff | POST | /iot | Add new Inter Operator Tariff rate |
Inter Operator Tariff | PATCH | /iot/{id} | Update Inter Operator Tariff rate by ID
Tariff Profile Custom Rates
Topic | Method | Entrypoint | Description |
---|---|---|---|
Tariff Profile | GET | /tariff_profile/{id} | List Tariff Profile Details by ID |
Tariff Profiles now contain applied custom rates for included tariffs where they exist.
Version 1.2.7 Additions
The following API requests have been added:
Topic | Method | Entrypoint | Description |
---|---|---|---|
Billing | GET | /organisation/{org_id}/billing | Retrieve a list of available billing periods (by default the calendar month) for which billing data is available. This will also include the current billing period. |
Billing | GET | /organisation/{org_id}/billing/{billing_id} | Retrieve billing data for a specific billing period of an organisation. |
Quota Management | GET | /endpoint/{endpoint_id}/quota/data | Retrieve the current data quota status for an endpoint |
Quota Management | POST | /endpoint/{endpoint_id}/quota/data | Set a new data quota for an endpoint
Quota Management | GET | /endpoint/{endpoint_id}/quota/sms | Retrieve the current SMS quota status for an endpoint
Quota Management | POST | /endpoint/{endpoint_id}/quota/sms | Set a new SMS quota for an endpoint
Prepaid Balance Management | GET | /endpoint/{endpoint_id}/balance | Retrieve the current balance for an endpoint
Prepaid Balance Management | POST | /endpoint/{endpoint_id}/balance | Update the balance of an endpoint by adding or subtracting a given amount
Prepaid Balance Management | DELETE | /endpoint/{endpoint_id}/balance | Reset the prepaid balance so that it starts again at zero
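As a sketch of setting a quota with the new endpoints, the snippet below constructs (but does not send) the POST request. The base URL is a placeholder, and the payload field names (`volume`, `threshold_percentage`) are assumptions for illustration only; the changelog does not show the request schema.

```python
import json
from urllib import request

# Placeholder base URL; substitute your actual API host.
API_BASE = "https://api.example.com/v1"

def build_quota_request(endpoint_id: int, quota_type: str,
                        payload: dict, token: str) -> request.Request:
    """Construct (without sending) POST /endpoint/{id}/quota/{data|sms}."""
    if quota_type not in ("data", "sms"):
        raise ValueError("quota_type must be 'data' or 'sms'")
    return request.Request(
        url=f"{API_BASE}/endpoint/{endpoint_id}/quota/{quota_type}",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# 'volume' and 'threshold_percentage' are illustrative field names only.
req = build_quota_request(123, "data",
                          {"volume": 100, "threshold_percentage": 80},
                          "app-token")
```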
Quota Management Events
If a threshold percentage was submitted when a data or SMS quota was created, a Threshold Reached event is generated once the remaining volume falls below that threshold.
If the data or SMS quota volume is completely depleted, a Quota Used Up event is generated.
Endpoint Events can be retrieved at /endpoint/{endpoint_id}/event.
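Once events have been fetched from that entrypoint, the quota-related ones can be picked out by their event type description. A sketch, using a hard-coded list whose shape mirrors the event example in this changelog:

```python
# Stand-in for the JSON returned by GET /endpoint/{endpoint_id}/event;
# only the fields needed for filtering are shown.
events = [
    {"id": 1, "event_type": {"description": "Quota Used Up"}},
    {"id": 2, "event_type": {"description": "Threshold Reached"}},
    {"id": 3, "event_type": {"description": "Update location"}},
]

QUOTA_EVENT_TYPES = ("Threshold Reached", "Quota Used Up")

quota_events = [e for e in events
                if e["event_type"]["description"] in QUOTA_EVENT_TYPES]
print([e["id"] for e in quota_events])  # → [1, 2]
```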
Prepaid Balance Management
Balance management is activated in the service profile and applies to all endpoints assigned to that service profile. The initial balance is zero, so these endpoints are blocked from all tele-services until a balance is added.
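The update and reset operations from the table above can be sketched as follows. The base URL is a placeholder, and the `amount` field name is an assumption; per the table, a positive value adds to the balance and a negative value subtracts, while DELETE resets it to zero.

```python
import json
from urllib import request

# Placeholder base URL; substitute your actual API host.
API_BASE = "https://api.example.com/v1"

def build_balance_request(endpoint_id: int, token: str,
                          amount=None) -> request.Request:
    """Construct (without sending) a balance update (POST) or reset (DELETE).

    If amount is None, a DELETE request is built, resetting the prepaid
    balance to zero. Otherwise a POST carrying the signed amount is built;
    the 'amount' field name is illustrative only.
    """
    url = f"{API_BASE}/endpoint/{endpoint_id}/balance"
    headers = {"Authorization": f"Bearer {token}"}
    if amount is None:
        return request.Request(url=url, headers=headers, method="DELETE")
    headers["Content-Type"] = "application/json"
    return request.Request(
        url=url,
        data=json.dumps({"amount": amount}).encode("utf-8"),
        headers=headers,
        method="POST",
    )

top_up = build_balance_request(123, "app-token", amount=10.0)
reset = build_balance_request(123, "app-token")
```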