Azure Stream Analytics job expensive for small data?

To write sensor data from an IoT device to a SQL database in the cloud, I use an Azure Stream Analytics (ASA) job. The job has an IoT Hub input and a SQL database output, and the query is trivial: it just passes all data through. According to the MS price calculator, the cheapest way of accomplishing this (in western Europe) is around 75 euros per month.

In practice, only one message per minute is sent through the hub, and the price is fixed per month regardless of the number of messages. I am surprised by the price for such a trivial task on so little data. Is there a cheaper alternative for such low capacity needs? Perhaps an Azure Function?

Renfred asked 8/3, 2019 at 11:13 Comment(2)
If you don't need any stateful processing, such as windowing, then for your data volume a simple Azure Function on the Consumption plan should indeed be cheaper. According to the pricing calculator for Functions: "The first 400,000 GB-s of execution and 1,000,000 executions are free." – Resiniferous
Or, if you don't need near-realtime, you can also use the archive feature of IoT Hub to send raw data to blob storage, and from there have a Data Factory job pick it up once an hour and write it to SQL DB. I'm not sure whether that's cheaper than Functions, though; for your data volume, I'd probably go with a Function. – Resiniferous

If you are not processing the data in real time, then SA is not needed: you could just use an Event Hub to ingest your sensor data and forward it on. There are several options for moving data from the Event Hub to SQL. As you mentioned in your question, you could use an Azure Function, or if you want a no-code solution, you could use a Logic App (a sketch of the Function approach follows the links below).

https://learn.microsoft.com/en-us/azure/connectors/connectors-create-api-azure-event-hubs

https://learn.microsoft.com/en-us/azure/connectors/connectors-create-api-sqlazure
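For the Function route, here is a minimal sketch in Python, assuming the Python Functions runtime with an Event Hub trigger binding in function.json and the pyodbc driver; the table and column names (SensorReadings, DeviceId, Temperature, ReadingTime), the message fields, and the SQL_CONNECTION_STRING app setting are placeholders, not anything from the question. At one message per minute (about 43,200 invocations a month), this stays well inside the Consumption plan's free grant mentioned in the comments above.

```python
import json
import logging
import os

import azure.functions as func
import pyodbc  # needs the Microsoft ODBC Driver for SQL Server on the host


def main(event: func.EventHubEvent) -> None:
    """Event Hub-triggered function: insert one sensor reading into Azure SQL."""
    reading = json.loads(event.get_body().decode("utf-8"))
    logging.info("Received reading from device %s", reading.get("deviceId"))

    # Connection string is kept out of code, in an app setting.
    conn = pyodbc.connect(os.environ["SQL_CONNECTION_STRING"])
    try:
        cursor = conn.cursor()
        cursor.execute(
            "INSERT INTO SensorReadings (DeviceId, Temperature, ReadingTime) "
            "VALUES (?, ?, ?)",
            reading["deviceId"], reading["temperature"], reading["timestamp"],
        )
        conn.commit()
    finally:
        conn.close()
```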

Excess answered 8/3, 2019 at 11:42 Comment(1)
Thanks for your answer. The data needs to be processed about once per hour. But the question remains: how to forward it on to the database? – Renfred

In addition to the previous answer, the "cold path" can be your solution: let Azure IoT Hub store the telemetry data in blob storage in batches, written at most every 720 seconds (the maximum batch frequency).

With Azure Event Grid subscribed to that blob storage, each new blob fires an EventGridTrigger subscriber, in which you can start a streaming process for that batch (or for a group of batches within one hour). Once the batch has been processed, the ASA job can be stopped. Note that the ASA job is billed based on active processing time (the time between Start and Stop), so the cost of using an ASA job can drop significantly. A sketch of the trigger is below.
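A minimal sketch of that trigger, assuming the Python Functions runtime with the azure-identity and azure-mgmt-streamanalytics packages; the app setting names (AZURE_SUBSCRIPTION_ID, ASA_RESOURCE_GROUP, ASA_JOB_NAME) are placeholders.

```python
import logging
import os

import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.mgmt.streamanalytics import StreamAnalyticsManagementClient


def main(event: func.EventGridEvent) -> None:
    """Fires on each blob-created event from IoT Hub's batched uploads."""
    blob_url = event.get_json().get("url")
    logging.info("New telemetry batch: %s", blob_url)

    client = StreamAnalyticsManagementClient(
        DefaultAzureCredential(),
        os.environ["AZURE_SUBSCRIPTION_ID"],
    )
    # Start the (already configured) ASA job; billing only accrues while it
    # runs. Stop it again once the batch is drained, e.g. from a
    # timer-triggered function, with client.streaming_jobs.begin_stop(...).
    client.streaming_jobs.begin_start(
        os.environ["ASA_RESOURCE_GROUP"],
        os.environ["ASA_JOB_NAME"],
    )
```

Note that the identity the Function runs as needs sufficient RBAC rights on the ASA job for the start/stop calls to succeed.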

Detruncate answered 8/3, 2019 at 12:47 Comment(1)
Do you have an example which shows the usage of Azure Event Grid and the EventGridTrigger? – Superimpose
