Channel: MSDN Blogs

How to connect to a database from an Azure Function


I created a simple Azure Function and now I want to connect it to a database.  Check out these other articles as well.

  • How to create an Azure Function in Visual Studio
  • Deploy an Azure Function created from Visual Studio
  • Check out all my Azure Function articles here

I also found this article useful: “Code and test Azure Functions locally”.

I used ADO.NET to make this simple connection to a database.  Here is the code I used.

I included these namespaces:

using System;
using System.Data.SqlClient;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;

Keep in mind that this is only an example and there may be a few places to optimize the code pattern.  I am fine if you point them out, just for fun.

public static class Function1
{
    // Note: for simplicity this example shares one static SqlConnection across
    // invocations; a safer pattern under concurrent requests is to create and
    // dispose a connection per invocation (e.g. in a using block).
    private static SqlConnection connection = new SqlConnection();

    [FunctionName("Function1")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]
        HttpRequestMessage req, TraceWriter log)
    {
        log.Info("C# HTTP trigger function processed a request.");

        try
        {
            // The connection string comes from an environment variable:
            // locally from local.settings.json, in Azure from Application Settings.
            connection.ConnectionString =
                Environment.GetEnvironmentVariable("DatabaseConnectionString");
            await connection.OpenAsync();
            return req.CreateResponse(HttpStatusCode.OK,
                $"The database connection is: {connection.State}");
        }
        catch (SqlException sqlex)
        {
            return req.CreateResponse(HttpStatusCode.BadRequest,
                $"The following SqlException happened: {sqlex.Message}");
        }
        catch (Exception ex)
        {
            return req.CreateResponse(HttpStatusCode.BadRequest,
                $"The following Exception happened: {ex.Message}");
        }
    }
}

While testing locally, I wasn’t able to get the Environment.GetEnvironmentVariable() method to read the ConnectionStrings values contained in my local.settings.json file, so I placed my connection string into the Values section instead, and it worked fine.
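For reference, here is roughly what my local.settings.json looked like with the connection string placed in the Values section (the server, database, and credential values are placeholders, not my real ones):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "DatabaseConnectionString": "Server=tcp:<your-server>.database.windows.net,1433;Initial Catalog=<your-db>;User ID=<user>;Password=<password>;Encrypt=True;"
  }
}
```

When running locally, the Functions host loads the Values section into environment variables, which is why Environment.GetEnvironmentVariable("DatabaseConnectionString") picks it up.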

When I executed the function locally using cURL, I received the successful result shown in Figure 1.

image

Figure 1, Azure Function App connecting to a database example

After I deployed my Function (see “Deploy an Azure Function created from Visual Studio”), the code that read the information from local.settings.json did not work any more. I received the error shown in Figure 2 when I attempted to execute the Azure Function.


"The following Exception happened: The ConnectionString property has not been initialized.<span style="display: inline !important; float: none; background-color: transparent; color: #333333; cursor: text; font-family: Georgia,'Times New Roman','Bitstream Charter',Times,serif; font-size: 16px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; line-height: 24px; orphans: 2; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; -webkit-text-stroke-width: 0px; white-space: normal; word-spacing: 0px;">"</span>

image

Figure 2, Azure Function App connecting to a database example

To make this work, I simply added an Application Setting to my Azure Function App named “DatabaseConnectionString” with the ADO.NET SQL connection string as its value, Figure 3.

image

Figure 3, Azure Function App connecting to a database example, cannot read from local.settings.json file

While I was messing around with the Application Settings I got these errors, Figure 4.

"The following Exception happened: Keyword not supported: '"server'."
"The following Exception happened: Format of the initialization string does not conform to specification starting at index 0"

image

Figure 4, Azure Function App connecting to a database example, cannot read from local.settings.json file

The first one I resolved by restarting the Azure App. I didn’t get a chance to troubleshoot it further; it may also have been because I had wrapped the Application Settings value in double quotes.

The second one was because I fat-fingered my connection string; once I entered the correct value, everything worked fine.
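For illustration, the difference is only in the shape of the stored value: the Application Setting must contain the bare ADO.NET connection string with nothing wrapped around it (the server, database, and credential values below are placeholders):

```
Wrong (wrapped in quotes, gives "Keyword not supported: '"server'"):
"Server=tcp:myserver.database.windows.net,1433;Initial Catalog=mydb;User ID=myuser;Password=mypassword;Encrypt=True;"

Correct:
Server=tcp:myserver.database.windows.net,1433;Initial Catalog=mydb;User ID=myuser;Password=mypassword;Encrypt=True;
```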


How to create an Azure Function in Visual Studio


“Create your first function using Visual Studio” also has good instructions, but I like to try it out myself. Check out my other articles as well.

First, open Visual Studio and create the Azure Function, similar to that shown in Figure 1.

image

Figure 1, how to create an Azure Function using Visual Studio

Select the trigger type; I chose the configuration below, i.e. an HTTP Trigger, as seen in Figure 2.

image

Figure 2, how to create an Azure Function using Visual Studio

Without making any changes, I pressed F5 and was asked to install the Azure Functions CLI tools (Figure 3), which I did.

image

Figure 3, how to create an Azure Function using Visual Studio

The simulator started up, as seen in Figure 4.

image

Figure 4, how to create an Azure Function using Visual Studio

Once the CLI tools were installed, the CMD window seen in Figure 5 opened and provided the URL to test the Azure Function.

image

Figure 5, how to test an Azure Function locally using Visual Studio

Then I used cURL from another CMD window to call and test the Azure Function; see Figure 6.

image

Figure 6, how to test an Azure Function locally using Visual Studio

That was too easy, Azure Functions are pretty cool.

Be sure to check out my other Azure Function articles here.

After I published my Function, I had some problems.

I resolved them, but the thing is that when I access the Function in the portal, I have limited means of updating it there in the portal itself. This is fine: since I decided to create it from Visual Studio, I will need to develop, test, and publish from Visual Studio from this point on. Do not think that you can go back and forth between developing in the portal and in Visual Studio; it is either/or, and you need to decide how you want to develop. When created from this direction, I see what is shown in Figure 7 when I navigate to the Function in the portal.

image

Figure 7, how to test an Azure Function locally using Visual Studio, Azure portal

Alternatively, you can consider creating the Azure Function App and Function in the portal, downloading the project, doing some development and testing in Visual Studio, and then cutting and pasting the changes from Visual Studio into the Azure Function IDE in the portal.

Rendering in the DirectX Shader Compiler Editor


Last time we looked at how to disassemble a shader in the DirectX Shader Compiler Editor.

If you looked carefully at the code inserted when you use Ctrl+N to include a new document, you'll see that there's an #ifdef'd section near the bottom.

This #ifdef'ed section contains a bit of XML written in the same format used for execution tests in the compiler. You can place the cursor on an XML element name and press F1 to get a quick reference.

The idea behind this is that it describes a single Dispatch or Draw call with very fine-grained control, for example to check whether heaps laid out in specific ways are behaving properly.

Note that the implemented area is extremely basic and there are many features of D3D12 that aren't supported; however, you may find this useful for prototyping.

To run the operation, you can select the View | Render option.

It's a good idea to turn on the Output window to see the debug spew generated. The render runs with the validation layers enabled, so you will often find useful information there.

Finally, you have the option of using shader model 5.0 and 5.1 as well, both for disassembly and for rendering. To change the shader model used for disassembly, update the hlsl-target value specified in the first line of the file. To change the shader model used for rendering, update the Target attribute on the Shader elements in the XML section.

There is very little here that is specific to the compiler, other than the fact that it's very easy to try things out, so we use it when we want to look at a tiny bit of behavior in isolation - and so can you, including when you're putting together a quick repro.

Enjoy!

Docs for Dynamics NAV 2016 and earlier are now on Docs.microsoft.com


For almost 10 years, you might have become used to finding content online for Dynamics NAV in the MSDN Library. We announced last year that we're moving to the Docs.microsoft.com site, and today we moved the last bits to the Docs site as well.

Online content for Microsoft Dynamics NAV 2016 and earlier versions of Dynamics NAV is now available in the Previous Versions section of the Docs site.

Here are the direct links for each version:

If you have bookmarked an article in the MSDN Library, you will be directed to the same article on the Docs site. You'll be able to use the Search field to search for specific articles, be that how to maintain fixed assets in Dynamics NAV 2015 or how to use the Compressed property in Dynamics NAV 2016.

But we highly recommend that you read the latest version instead: How to maintain fixed assets in Dynamics NAV or in Business Central. Also, read the latest article on how to use the Compressed property in Dynamics NAV - or in AL. Those docs are still being updated by the team on a regular basis, and you can use the breadcrumbs just below the blue banner to navigate between the different sections.

Stay tuned for an update on how you can adopt, customize, extend, and collaborate with Microsoft on the latest versions here on the blog.

Best regards,

The Dynamics NAV content experience team

1st year Computer Science project – Food Diary web app powered by Microsoft Cognitive services



Guest post by Warren Park, Computer Science student at University College London (UCL)

image

I am a first-year student at UCL studying computer science. As part of the course, I was assigned to a team and had to develop an application for Microsoft.

The title of the project was “Microsoft Cognitive Node.js app with cloud storage support”, and I was the team leader. After a meeting with the client (Anze Vodovnik from Microsoft), my team members and I decided to make a food diary web app for people who need to record the food they have eaten each day due to allergies, or for people who are simply interested in recording the foods they eat daily.

My team had many ideas for the application, and decided to develop a web app that retrieves a food image and a face image pair from cloud storage providers (OneDrive, Google Drive, and Dropbox), analyses the images using the Microsoft Vision API (to generate tags and a food title) and the Face API (to determine the emotion in the face, and thus the degree of satisfaction with the food), and then shows the analysis result on the web app in the form of a card like the one below:

image

My team had three members including me, and each person had different roles.

I was responsible for the server-side web app (front- and back-end) programming, the UI design (i.e. client-side front-end) and programming, client liaison, and report editing. Wei Tan was responsible for client-side web app back-end research and programming; since he made significant progress towards the end, he was also assigned some server-side back-end function programming tasks and implementations of auxiliary features, and he worked as a copy editor for the reports. Terry Williams was responsible for client-side web app research and programming, was also a copy editor for the reports, and was the video editor.

Since I was responsible for dealing with the Vision API and Face API, I started by reading the documentation on the Microsoft website.

I read the documentation at http://docs.microsoft.com and found that the Microsoft Cognitive Services are very easy to use, since most operations are simple HTTP requests. Sending images as an octet stream or by URL was implemented quickly in the web app.

Then I implemented all of the Microsoft Cognitive Services needed for the application, and found that by using an npm library called “cognitive-services” (https://www.npmjs.com/package/cognitive-services), the implementation could be done even more concisely. The library handles most of the Microsoft Cognitive Services and exposes the various functionalities of each API separately. I decided to use this library instead of raw HTTP requests and was able to develop the app more quickly.

When most of the backend development had been finalised, I had to find a storage provider that could handle images securely and easily. As the staff from Microsoft recommended, I tried Azure Storage, especially Blob storage.

Using Blob storage, I was able to upload image files directly to the storage without conversions, and also download them directly. Since no conversion steps were required, it was easier to handle images. Microsoft provides an npm module called “azure-storage” (https://www.npmjs.com/package/azure-storage) that lets developers use Azure storage services with predefined upload and download functions, so there was no need to write separate upload and download handling.

Finally, my team members and I were able to develop a web app that can create a card containing a single food diary entry with the analysis results from the Microsoft Cognitive Services, search through the cards, suggest a restaurant based on the accumulated eating history and the user's current location, and show mood trends over time.

image

When the deployable version of the web app had been developed, my team needed a web hosting service for it. I found that Azure provides an effective Node.js web hosting service, and I deployed the web app from our GitHub repository, which demonstrated our understanding of DevOps and continuous delivery and deployment.

Using the GitHub repository, the Azure web app service set up the application automatically, so I did not have to run “npm install” from the console. It showed “Failed” if there was something wrong with the deployment, along with the log generated by the system, which was helpful for debugging the web app.

The overall architecture of the system had six components, which are illustrated below:

image


Firstly, the download server is a virtual server (Azure web app service) that only serves resources, e.g. images, HTML, and JavaScript framework files, to the clients requiring them. The resources are delivered over HTTP.

Secondly, the client downloads HTML pages and JavaScript resources from the download server. The client also obtains the tokens and file ids from the cloud storage service providers based on user authentication and file selection. It sends appropriate HTTP requests (including search, delete and edit) to the main server and displays the data retrieved from the server. Based on the data retrieved, the client can also retrieve cards and images stored in the Azure storage by using a SAS token. Several other features including map viewing, Q&A, user sign-ins and about page are also supported by the client.

Thirdly, the cloud storage service providers allow users to authenticate their accounts and enable the client side of our web application to retrieve the access token. They also enable downloading of images, file names, and file IDs.

Fourthly, the main server (also a virtual server) is where all the data is processed; it handles suggestion, search, edit, delete, and creation of cards. Separating out the server that uses a lot of computational resources makes resource-related problems easier to deal with: even if the main server is down and unable to serve, the download server can still display notices, or at least a “server not responding” message, in the client-side web app.

Fifthly, a MySQL DB stores an index of each card. It stores the essential data per entry that needs to be searched, and therefore enables searching of the cards to happen efficiently and effectively.

Finally, the other service providers include the Microsoft Cognitive Services servers, the Microsoft Graph API server for user authentication (user sign-in), Google Maps for maps and geocoding, Microsoft Azure Storage, and the download servers of the cloud storage providers. The client can connect only to a limited number of providers, which provides better security, since the majority of the API keys and secrets are never revealed to the client, who could also be an attacker. For the connection to Azure Storage, clients use a SAS key instead of the normal account key, which lets the developer control the access permissions of users; any key used in the client-side web app is visible to users through the browser's developer mode.

All of the connections between the system components use HTTP, except the connection to the Microsoft Graph API, which uses HTTPS.

With the architecture described above, the system was able to provide a framework for other application developers while ensuring the functionality of the web app.

Overall, this project was a fascinating development experience for my team. I don't think this development would have been possible if Azure or the Cognitive Services did not exist. Thus, I highly recommend Azure services to anyone who wants to develop AI-enabled services or services that handle images.

*The web app we developed is available at https://food.azurewebsites.net

*The public GitHub repository, including all the source code and the report, is available at https://github.com/Warren-Park/UCL-Cognitive-TEAM45-_Food_Diary

Microsoft Technical content for students and educators http://aka.ms/azurestudentlabs for easy access


 

Free technical resources for faculty, students, and Microsoft developer advocates, for use in computer science learning forums, are available in the Azure Educator Services GitHub Repo.

 

This repo provides technical resources to help students and faculty learn about Azure and teach others. The content covers cross-platform scenarios in AI and machine learning, data science, web development, mobile app dev, internet of things, and DevOps.

 

Students can get free Azure credits to explore these resources here:

Your feedback is appreciated - please fork this repo and contribute!

To report any issues, please log a GitHub issue. Include the content section, module number and title, along with any error messages and screenshots.

 

March Update

1. Added a new, super simple Azure web apps lab (students should be able to get through it in 30 minutes or less, and it integrates/introduces GitHub! Perfect onboarding for total beginners or as a short ramp-up before a hackathon, etc.)

2. Added a new Azure Video Indexer lab

3. Totally revamped the Azure Bot Service lab

4. Updated the text on the home page to highlight the Azure for Students offer, along with other small tweaks

5. Restructured content to highlight labs, and separated content into Labs, Educator content, and Events content

6. Added descriptions to all labs, making it easier for users to find what they're interested in learning/teaching (example)

7. Added approximate pricing ($, $$, $$$) to all labs, so users know about how much they'll spend in Azure while going through the lab (example)

8. Updated existing labs to keep up-to-date with Azure product changes

9. Cleaned up the issues backlog

10. Cleaned up the pull request backlog

11. Added a new short URL, http://aka.ms/azurestudentlabs, to make it easy to get to our repo

 

Changes Planned for April

1. New Blockchain lab!

2. New TensorFlow lab!

3. More updates to existing labs to stay up-to-date (as always).

 

New labs NOW Available

Azure Web Apps + GitHub

Azure Video Indexer

Azure Bot Service

 

Understanding the cost of Azure and Cloud Services

• Each lab clearly indicates the scenario, technology, and approximate cost ($, $$, $$$) of using Azure. Further Azure costs can be estimated with the Azure Pricing Calculator: https://azure.microsoft.com/en-us/pricing/calculator/

 

Content organisation at http://github.com/msftimagine/computerscience (http://aka.ms/azurestudentlabs):

Labs: AI/ML, Azure Services, Big Data, Deep Learning, IoT, Web Dev
Events: Event-in-a-Box, Tech Talks, Azure U Tour (eventually)
Educator content: Scripts, Azure Guides, Course Content

Available Labs and Resources

Machine Learning/AI

Azure Bot Service

Azure Machine Learning

Azure Notebooks

Cognitive Toolkit

Custom Vision Service (new!)

LUIS

Video Indexer

 

Web Development

Azure Web Apps

Azure Web Apps and GitHub (new!)

image

Deep Learning

200 - Machine Learning in Python

300 - Neural Networks with CNTK

400 - Image Classification with MMLSpark

400 - Stream Analytics and Machine Learning

 

Big Data & Analytics

Azure Data Lake

Hadoop on Azure HDInsight

Spark on Azure HDInsight

Microsoft Power BI

 

Internet of Things

Azure Stream Analytics

 

Azure Services

Azure Container Service

Azure Functions

Azure High-Performance Computing (HPC)

Azure Storage

Blockchain on Azure

VM Scaling

Outbound mail queued up at Edge Server with 451 4.4.0 DNS query failed


I have come across a couple of cases with this issue recently and thought it worth a blog post: if I see multiple people calling into MS support for an issue with the same fix, I'm sure others are experiencing it as well. I'll keep this short and sweet, though. In the recent case, a customer implemented a new Exchange environment with Edge servers and could receive inbound email fine, but outbound email was failing on their Edge servers with the error below in the queue:

LastError: {LED=451 4.4.0 DNS query failed. The error was: DNS query failed with error ErrorRetry -> DnsQueryFailed

We got the DNS query failure for all external domains. We verified via nslookup that we could resolve the MX records of external domains just fine, so there wasn't an issue with the DNS servers we had configured. The issue turned out to be one simple (advanced) setting on the NIC: "Register this connection's addresses in DNS".

After we checked this setting we were able to successfully resolve MX records and send outbound mail!

Now you may be asking why this would be the case, when clearly we can resolve DNS records with our configured DNS servers. My theory is that this is due to design differences between the Exchange 2013/2016 code and Exchange 2010. For instance, in Exchange 2010 there was just a checkbox on the DAG Network in the EMC if you wanted a network to be MAPI/replication enabled or not. When managing your DAG networks in Exchange 2013/2016, Exchange decides whether a network is "Replication Enabled" based on whether "Register this connection's addresses in DNS" is checked on the NIC. If it's checked to register in DNS, Exchange thinks "oh, this is a resolvable NIC address, this must be a MAPI network"; if it's not a registered DNS address, then the NIC is probably used for replication traffic and isn't a client-accessible network. My theory is that Transport uses this same logic when it does MX lookups for external domains: if the NIC isn't registered in DNS, it won't do DNS lookups.

 

Hopefully if you're having these symptoms this will help you out!

//DannyP

Experiencing errors while creating an Application Insights app using Visual Studio – 04/02 – Mitigating

Update: Thursday, 05 April 2018 22:28 UTC

The hotfix has been successfully deployed and validated in the EUS region. The ETA for hotfix rollout completion to all regions is EOD today.
  • Work Around: Apps can be created using Azure portal without any issues
  • Next Update: Before 04/06 23:00 UTC

-Dheeraj


Update: Tuesday, 03 April 2018 22:24 UTC

Hotfix has been successfully deployed in Canary and BrazilUS regions. Currently, we are trying to prioritize this Hotfix rollout for other regions in the order of EastUS, SouthCentralUS, WEU, WUS2, Southeast Asia and NEU. Current ETA for Hotfix rollout across all regions is EOD Friday.
  • Work Around: Apps can be created using Azure portal without any issues
  • Next Update: EOD Friday

-Dheeraj


Update: Monday, 02 April 2018 21:53 UTC

We identified the root cause of this issue. To fix it, we are moving forward with hotfix deployment in this order: EastUS, SouthCentralUS, WEU, WUS2, Southeast Asia, NEU. We currently have no ETA on resolution and are trying to expedite the rollout of this hotfix.
  • Work Around: Apps can be created using Azure portal without any issues
  • Next Update: Before 04/03 22:00 UTC

-Dheeraj


Initial Update: Monday, 02 April 2018 16:55 UTC

We are aware of the issues within Application Insights and are actively investigating. Customers creating a new project with Application Insights on by default in Visual Studio 2015 will see a failure message like the one below:

'Could not add Application Insights to project. Could not create Application Insights Resource : The downloaded template from 'https://go.microsoft.com/fwlink/?LinkID=511872' is null or empty. Provide a valid template at the template link. Please see https://aka.ms/arm-template for usage details. This can happen if communication with the Application Insights portal failed, or if there is some problem with your account.'


  • Work Around:  Apps can be created using Azure portal without any issues
  • Next Update: Before 04/02 21:00 UTC

We are working hard to resolve this issue and apologize for any inconvenience.


-Sapna


Students hack for social good @ first-ever Vatican Hacks


“How wonderful would it be if the growth of scientific and technological innovation would come along with more equality and social inclusion?”
- Pope Francis

Recently we had an opportunity to support and mentor 120 students @ VHacks, the first-ever student hackathon hosted at Vatican City. They represented 60 different universities in 28 countries – from UC Berkeley to Georgetown, Chonnam National University in Korea to National Taiwan University. It was a celebration of so many cultures and beliefs, bringing together bright students with diverse academic, ethnic, and religious backgrounds. And these students were challenged to tackle some of the most difficult issues of our time such as social inclusion, interfaith dialogue, and aiding migrants & refugees.

To say these students rose to meet this challenge would be an understatement. Most of the teams had never met each other prior – their ability to ideate, form consensus, and develop a shared mission was inspiring. Their talents in design, development, and entrepreneurship were impressive. And in just 36 hours, they expressed their ideas through cutting-edge tech like Artificial Intelligence and Mixed Reality. Let’s take a look at just a sample of what these students created…

Zelixa – helping dyslexic people train themselves to read better

We were so impressed by the combination of vision and ambition that Team Zelixa brought to VHacks. The team used HoloLens and Azure Computer Vision to analyze natural user inputs (eye position & focus) and printed reading materials (sentence structure, words, syllables, grammar, typefaces, and colors) to help people with dyslexia read more fluently in real-time. In the relatively short (36-hour) hackathon format, they achieved this by using a set of pre-trained, automated services like Optical Character Recognition (OCR) to output to a JSON file, breaking down the specific attributes of the text. The team researched common reading behaviors found in people diagnosed with dyslexia and built a model that transformed the text into new images. Then, they used the HoloLens to spatially map those images over the user’s reading experience in real-time. Nice! Sounds like a good project for Imagine Cup.

Zelixa image

Vinculum – reuniting refugee families with their loved ones

Every year, tens of thousands of migrants and refugees lose one another in the process of relocation. Oftentimes they don’t have identity documentation with their family name or previous address. Team Vinculum had an idea that might help refugee camps have higher success rates in reuniting families especially over long distances. Using Face API – part of Azure Cognitive Services – they analyzed digital and printed photographs for common features that might suggest a parent-child relationship. By cross-referencing refugee camp records with their experimental Azure Machine Learning matching model, they might securely help local authorities have confidence in sending family members to the correct location. One of the particularly elegant things about Vinculum’s project was that they built a very lightweight web app noting that camps often have very limited internet bandwidth for large populations; every bit and every pixel counts when analyzing new images for a potential match.

You can learn more about the 9 finalist teams here!

Students – start building with Artificial Intelligence and Mixed Reality alike

These students at Vhacks just scratched the surface of what’s possible when you combine technologies like Azure’s AI and Windows Mixed Reality devices. What they did in just in 36 hours was quite amazing. Try these out for yourself here:

Thank you!

This event took a huge team of visionaries and organizers alike. We had the opportunity to just show-up and mentor students – the fun part – so we’d like to thank the team of passionate people and orgs that made it all possible…

  • Students at Harvard and MIT for your countless hours in bringing us all together
  • OPTIC, a Vatican-affiliated global think-tank dedicated to ethical issues of disruptive technologies, for your vision and initiative
  • Vatican’s Secretariat for Communication for your leadership and inspiration
  • The Pontifical Council for Culture and Migrants & Refugees Section of the Dicastery for Integral Human Development – because many of you stayed up all night to help the students
  • Our fellow dev advocates and mentors at Google – sharing our collective talents with the students alongside you. And geeking out together over HoloLens demos. 😊
  • Anyone else we may have missed who made this possible – thank you!

Thoughts on de:code 2018


Hello, everyone.
April is already upon us, and the new fiscal year has begun. Please note that the early-bird registration discount for de:code 2018 closes on Tuesday, April 24, which is slightly earlier than in previous years.
From people considering attending, we have received many questions such as "What will be announced at this year's de:code?", "Who are the special guests?", and "I'd like to know the session details." We are busy planning content that every attendee will find exciting and enjoyable, so stay tuned!
de:code 2018 consists of six technical tracks: AI, App Client, App Development, Cloud Infrastructure, Data, and Special. Today, Masayuki Ota (大田昌幸), overall director of all sessions, acts as navigator and, together with the owner of each technical track, introduces the highlights of each track in a video, so please take a look.

--------------------------------------------------------------------------

  • The official website is here
  • Early-bird registration is here
    • Early-bird discount deadline: Tuesday, April 24, 2018
  • Session information is here
  • Social media (SNS)

--------------------------------------------------------------------------

Database Migration Guide – March 2018 updates


What’s new with the latest updates?

Call to action

Please take some time to review the Migration Guide and consider:

  • The accuracy and scope of existing content.
  • Any content/learnings/guidance that you feel should be added.
  • Any valuable content/learnings/guidance that you can contribute.
  • The overall user experience.

After you have a chance to work with the Guide, please be sure to submit any feedback you might have by using the links in the footer of the Migration Guide scenario pages:

Thanks in advance for your support!

Azure Content Spotlight – Building Mobile Apps


Welcome to another Azure Content Spotlight! These articles are used to highlight items in Azure that could be more visible to the Azure community.

Azure provides a rich set of backend services for building mobile solutions. This week's highlight is a collection of resources put together by Rob Caron: Essential tools and services for building mobile apps. The post features cross-platform mobile development with Xamarin and Visual Studio, as well as native iOS and Android development resources, but this highlight focuses on the Azure services featured in the Azure Friday video, namely Azure Functions and Cosmos DB.

The Azure Friday video provides an excellent illustration of how a single developer can easily roll out a global mobile solution.

Cheers!

Experiencing Data Access Issue in Azure Portal for Many Data Types – 04/06 – Resolved

Final Update: Friday, 06 April 2018 02:32 UTC

Application Insights experienced data access issues for around 25 minutes. We've confirmed that all systems are back to normal, with no customer impact, as of 04/06, 02:10 UTC. Our logs show the incident started on 04/06, 01:45 UTC, and during the 25 minutes it took to resolve the issue, around 13% of customers would have experienced data access issues. The following data types were affected: Availability, Customer Event, Dependency, Exception, Metric, Page Load, Page View, Performance Counter, Request, and Trace.

  • Root Cause: The failure was due to availability issues in one of our back-end services in EastUS region.
  • Incident Timeline: 25 minutes - 04/06, 01:45 UTC  through 04/06, 02:10 UTC

We understand that customers rely on Application Insights as a critical service and apologize for any impact this incident caused.

-Sapna


Webinar 4/12: Getting Started with Model Driven Apps in PowerApps


Getting Started with Model Driven Apps in PowerApps

Work along with Adrian as he walks you through the basics of how to get started with Model Driven Apps. He will help you to understand:
-What types of apps are prime candidates for Model Driven App Builds
-How to build your first Common Data Service Database
-How to navigate the user interface and designer components to build a quick survey app
-How to share and test your new model driven app

When: 4/12/2018

Where: https://www.youtube.com/watch?v=buDDSzJTgns

 

Adrian Orth, Principal Program Manager

 

Agility and Feedback with OneNote


Educators are constantly assessing their students on content knowledge. We often walk around the classroom to see where students are at, probe the whole class, and commonly use "exit tickets" as a strategy for formative assessment. The process can feel both repetitive and limiting.

Thankfully, all of this can be done without the educator being "trapped" at the front of the room by a wired projector. Using Miracast wireless display technology, built into Windows, educators can easily connect their computer to the projector without cables. This allows the teacher to move around the room, help individual students, redirect off-task behaviour, make notes about student progress in their notebook, and assess student comprehension by polling the whole class at once. The teacher never has to return to the front of the classroom.

Providing students with live feedback enriches their learning outcomes, keeps them focused, and ensures transparency throughout the learning process. Students no longer fear what their feedback will look like the following day; they can address it in the moment to maximise their learning outcomes.

Microsoft tools and wireless device connectivity alone have the power to transform an educator's pedagogical approach to effective assessment in the classroom.

 


 

Click on the link below to access the online tutorial on how to provide feedback using OneNote.

https://education.microsoft.com/Story/Article?token=em5z9


Top 3 Tips to provide feedback using OneNote:

  1. Draw and ink as you would on paper - use the pen on your device to annotate student work (click on the Draw tab and choose a pen colour)
  2. Record audio and video - comment verbally as you annotate, using OneNote's native audio and video recording tools
  3. Use colour to distinguish contributors - each group member types in a different colour, or the teacher annotates in a colour different from the student's work sample

Ready to complete the CPD course online and earn your badge? Click below now!

 


Kali Linux for Windows


This is pretty cool, as I am very interested in this specific flavor of Linux. I have installed it and it's very cool. You can download it from the Microsoft Store.


Then search for “Kali” and install it.

Here is the description:

“The Kali for Windows application allows one to install and run the Kali Linux open-source penetration testing distribution natively, from the Windows 10 OS.  To launch the Kali shell, type ‘kali’ on the command prompt, or click on the Kali title in the Start Menu.  The base image does not contain any tools, or a graphical interface in order to keep the image small, however these can be installed via apt commands very easily.  For more information about what you can do with this app check here.”


Once you install it, you can open it as per the instructions: enter 'kali' at a command prompt or select it from the Start Menu.


I did receive this when I attempted to run it for the first time.

Error: 0x8007019e The Windows Subsystem for Linux has not been enabled.


Simply follow the instructions here and it will work.

But for starters, you need to enable the Windows Subsystem for Linux, just as the error message stated. Note that enabling the feature requires a REBOOT to complete.

Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux

And here……..we……..GOooooooooo!


Rock!

Performance Degradation in West Europe – 04/06 – Investigating


Update: Friday, April 6th 2018 10:59 UTC

Our DevOps team continues to investigate issues with Slow and Failed commands in West Europe. Root cause is not fully understood at this time, though it is suspected to be related to AAD issues in the region. The problem began at 09:15. We currently have no estimated time for resolution.

  • Next Update: Before Friday, April 6th 2018 13:00 UTC

Sincerely,
Niall


Initial Update: Friday, April 6th 2018 09:54 UTC

We're investigating Performance Degradation in the VSTS services in West Europe.

  • Next Update: Before Friday, April 6th 2018 10:54 UTC

Sincerely,
Niall

Experiencing Data Access Issue in Azure Portal for Many Data Types – 04/06 – Resolved

Final Update: Friday, 06 April 2018 10:44 UTC

We've confirmed that all systems are back to normal with no customer impact as of 04/06, 9:45 UTC. Our logs show the incident started on 04/06, 9:15 UTC and that during the half hour it took to resolve the issue, approximately 5% of customers experienced data access issues in the Azure Portal and in the App Analytics Portal.
  • Root Cause: The failure was due to performance degradation in one of our dependent platform services.
  • Incident Timeline: 30 minutes - 04/06, 9:15 UTC through 04/06, 9:45 UTC

We understand that customers rely on Application Insights as a critical service and apologize for any impact this incident caused.

-Varun


Initial Update: Friday, 06 April 2018 10:00 UTC

We are aware of issues within Application Insights and are actively investigating. Some customers may experience data access issues in the Azure Portal. The following data types are affected: Availability, Customer Event, Dependency, Exception, Metric, Page Load, Page View, Performance Counter, Request, and Trace.
  • Work Around: None
  • Next Update: Before 04/06 12:00 UTC

We are working hard to resolve this issue and apologize for any inconvenience.
-Varun

Business Central developer docs are here!


We’re pleased to announce that the Business Central Developer documentation is available here: https://docs.microsoft.com/dynamics365/business-central/dev-itpro.

This is where you will see all new updates to AL documentation going forward, so please remember to bookmark this page. With the availability of our Business Central cloud-based offering, we have moved our documentation to this new URL based on a new repo. Dynamics NAV and C/SIDE documentation continues to be available here: https://docs.microsoft.com/dynamics-nav/getting-started.

 

Feedback and Contributions

With this new public repo, we have enabled a new feedback mechanism that distinguishes between product feedback and documentation feedback. Feedback is given on a per-topic basis. If you have product feedback, you will be sent here. Documentation feedback is registered as GitHub issues, allowing us to track and respond to incoming bugs, suggestions, questions, and more. All you have to do to provide documentation feedback is sign in with a GitHub account. Or, if you have ideas or would like to make a contribution, you can go directly to our public repo here: https://github.com/MicrosoftDocs/dynamics365smb-devitpro-pr and propose a change.

 

Let us know what you think! We’re continuously working on updating and improving the developer help for Business Central so stay tuned.

 

Best regards,

NAV UA Platform

 

 

Upgrading to Dynamics NAV 2018 from Dynamics NAV 2013 or Dynamics NAV 2013 R2


If you want to upgrade a customer's database from Dynamics NAV 2013 or Dynamics NAV 2013 R2, you may have noticed that the latest Dynamics NAV 2018 cumulative update does not offer a direct upgrade path from these earlier versions.

The solution is to upgrade to Dynamics NAV 2018 Cumulative Update 2, and then upgrade to the latest Dynamics NAV 2018 cumulative update.

For more information, see Upgrading to Dynamics NAV 2018, which has been updated accordingly.

We hope this helps those of you who have been confused about Dynamics NAV 2018 Cumulative Update 3, and we apologize for the experience you had. Going forward, please use the processes outlined in Upgrading to Dynamics NAV 2018.
