The following code snippet shows how to do pagination in Cosmos DB. We just need to pass a page size and a continuation token: when calling the method for the first time, we pass null, and on subsequent requests we pass the continuation token that is returned as part of this method's response.
public async Task<ProductResponse> GetAllProducts(int pageSize, string continuationToken)
{
    var products = new List<Product>();
    var productResponse = new ProductResponse();
    var queryDef = new QueryDefinition("SELECT * FROM Products p");

    // Pass null on the first call; pass the token from the previous
    // response to fetch the next page.
    string token = string.IsNullOrWhiteSpace(continuationToken) ? null : continuationToken;

    using FeedIterator<Product> resultSet = _container.GetItemQueryIterator<Product>(
        queryDefinition: queryDef,
        continuationToken: token,
        requestOptions: new QueryRequestOptions { MaxItemCount = pageSize });

    // Read a single page of up to pageSize items.
    FeedResponse<Product> items = await resultSet.ReadNextAsync();
    foreach (Product item in items)
    {
        products.Add(item);
    }

    productResponse.ContinuationToken = items.ContinuationToken;
    productResponse.Products = products;
    return productResponse;
}
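A caller can page through the full result set by looping until the returned continuation token is null. A minimal sketch (assuming `ProductResponse` exposes `Products` and `ContinuationToken` properties, and `Process` is a hypothetical consumer of each page):

```csharp
string continuationToken = null;
do
{
    // First iteration passes null; later iterations pass the token
    // returned with the previous page.
    ProductResponse page = await GetAllProducts(pageSize: 50, continuationToken);
    Process(page.Products);   // hypothetical consumer of one page
    continuationToken = page.ContinuationToken;
} while (continuationToken != null);
```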
Azure Cosmos DB serverless went GA last week, but the documentation has not caught up yet. If you are looking to provision an Azure Cosmos DB serverless account using Bicep, you can use the following snippet; the magic here is the capabilities section.
Issue: when an exception happens in Azure Data Factory notebooks, only a URL containing the details is sent back to Data Factory; it does not go to the error flow.
One of the solutions we implemented was to catch the exception in the notebook and send back a JSON string containing the following:
{
  "isError": "true",
  "ExceptionDetails": "some exception happened"
}
Data Factory can then parse the JSON object, log it if it wants, and choose the appropriate flow to handle the exception.
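Inside the notebook, this can be sketched as follows (a minimal sketch, assuming a Databricks notebook where `dbutils` is available; `run_etl` is a hypothetical work function):

```python
import json


def error_payload(exc: Exception) -> str:
    """Build the JSON string the notebook returns to Data Factory."""
    return json.dumps({"isError": "true", "ExceptionDetails": str(exc)})


# In the notebook, wrap the work and exit with the payload
# (dbutils only exists inside a Databricks notebook):
#
# try:
#     run_etl()
#     dbutils.notebook.exit(json.dumps({"isError": "false"}))
# except Exception as e:
#     dbutils.notebook.exit(error_payload(e))
```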
If you want to terminate the pipeline, then do something like the following, where we use dynamic SQL to raise an error.
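A minimal sketch of that dynamic SQL (the message text is a placeholder; raising an error with severity 16 fails the activity that runs it, which terminates the pipeline):

```sql
DECLARE @details nvarchar(400) = N'some exception happened in the notebook'; -- placeholder
DECLARE @sql nvarchar(max) =
    N'RAISERROR(''' + REPLACE(@details, '''', '''''') + N''', 16, 1);';

EXEC sp_executesql @sql;
```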
Out of the box, HashiCorp doesn't offer a Terraform HTTP API/SDK for provisioning resources to cloud providers using Terraform templates (their proprietary cloud does an awesome job of this).
Azure and AWS, on the other hand, do have APIs (SDKs) for provisioning resources using ARM and CloudFormation templates.
For this POC I went ahead and created a wrapper around the Terraform CLI to create a resource group in Azure.
The wrapper is written as C# Azure Functions running locally; the second post of this series will cover the same functions running in Azure (hopefully).
The steps that I took are as follows:
Installed the Azure CLI and logged in.
Created an Azure Functions project.
Copied the Terraform template and the Terraform CLI into the Windows temp folder (C:\Users\<<username>>\AppData\Local\Temp)
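The core of such a wrapper can be sketched like this (a minimal sketch, not the actual function code: the `TerraformRunner` class, file names, and working-directory layout are assumptions; it simply shells out to the terraform binary with `System.Diagnostics.Process`):

```csharp
using System.Diagnostics;
using System.IO;

public static class TerraformRunner
{
    // Runs "terraform <args>" in the folder holding the template and CLI,
    // returning the combined console output.
    public static string RunTerraform(string workingDir, string args)
    {
        var psi = new ProcessStartInfo
        {
            FileName = Path.Combine(workingDir, "terraform.exe"),
            Arguments = args,
            WorkingDirectory = workingDir,
            RedirectStandardOutput = true,
            RedirectStandardError = true,
            UseShellExecute = false
        };

        using var process = Process.Start(psi);
        string output = process.StandardOutput.ReadToEnd();
        string errors = process.StandardError.ReadToEnd();
        process.WaitForExit();
        return output + errors;
    }
}

// Called from the function: init once, then apply the template.
// var dir = Path.GetTempPath();
// TerraformRunner.RunTerraform(dir, "init");
// TerraformRunner.RunTerraform(dir, "apply -auto-approve");
```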
As you know from my previous posts, I recently bought a Fitbit Ionic for its built-in GPS and music player. I have been using Player FM on mobile forever; one of its nice features is the ability to increase/decrease the audio speed, and I am used to listening at 1.2x speed. Without this option in the native Ionic music player, I had to fall back on editing the podcasts on my machine using ffmpeg.
All you have to do is run the following command, using the atempo parameter to adjust the speed accordingly.
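A sketch of that command (input and output file names are placeholders; `atempo=1.2` speeds the audio up by 1.2x):

```shell
# -filter:a applies the atempo audio filter; -vn drops any embedded
# cover-art video stream so only the audio is re-encoded.
ffmpeg -i podcast.mp3 -filter:a "atempo=1.2" -vn podcast_1.2x.mp3
```

Note that a single atempo filter accepts factors between 0.5 and 2.0; for larger changes, chain the filter (e.g. `atempo=2.0,atempo=2.0`).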
I have been interviewing for a while now, for my team and for other teams. Most of the applicants are more than apt with their technical skills and logical reasoning; if you are still not sure about a candidate's technical skill set, ask them to do a quick hands-on exercise or to give a code walkthrough of one of their GitHub projects.
But how do we decide whom to hire? What we should be asking ourselves is "would I have this person on my team?" If the answer is yes, then nothing else matters; just go ahead and select them. Technical skill can be worked on if the applicant is a good fit for your team, and yes, apart from their skill set, your gut feeling does matter.