


Lesson Learned #39: Best Practices using BCP in Azure SQL Database


In multiple support cases, our customers have asked about best practices for avoiding filling up the transaction log or tempdb when importing a large number of rows using BCP.

In this new video, available in Spanish and English, you will find these best practices, along with how to check compatibility issues with Azure SQL Database, how to use the BCP files generated by the bacpac process, and how to automate the process using PowerShell.
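As a concrete, hedged illustration of one of these best practices, committing in batches keeps each transaction small so the log can truncate between batches; the server, database, table, and credential values below are placeholders:

```shell
# Import data.dat into dbo.MyTable in batches of 10,000 rows (-b) using
# character format (-c); each batch commits as its own transaction, so the
# transaction log never has to hold the entire load at once.
bcp dbo.MyTable in data.dat -c -b 10000 \
    -S myserver.database.windows.net -d mydb -U myuser -P mypassword
```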

Enjoy!

Azure Event Grid


One of the recently introduced additions to Microsoft Azure is the Event Grid service. Its job is to broker the delivery of short messages (events) across the internet. An example of an event we might want to pass along is the creation of a new order in an e-shop.

Technically, Event Grid acts as a communication broker following the publisher-subscriber pattern. The main advantage of this design is that the publisher does not need to know its subscribers; it simply sends the message to Event Grid. Messages are delivered as structured JSON over HTTP. Like other Microsoft Azure services, Event Grid is billed on a pay-as-you-go basis according to the number of operations processed. The service also provides a clear usage report, down to the level of individual events.

 

Configuring Event Grid

Event Grid can be used in two basic scenarios. The first is distributing events related to the state and usage of other services within your Microsoft Azure subscription. Using the Azure Logic App Designer, you can quickly click together an application that, for example, generates an e-mail with details whenever a new user is added to the subscription and sends it to the owner's mailbox. A concrete demo can be seen in this video starting at 5:15: https://azure.microsoft.com/en-us/blog/introducing-azure-event-grid-an-event-service-for-modern-applications/.

 

The second scenario is brokering communication between applications outside Microsoft Azure through a manually created Event Grid instance. The instance can be created by script or through the Microsoft Azure web portal. In both cases, the basic unit is the Event Grid Topic, which represents a logical hub connecting specific publishers and subscribers. When an Event Grid Topic is created, two key attributes are generated: the Topic Endpoint and the Access Keys. These are the URL to which publishers send events and the keys used to authenticate individual HTTP requests.
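To make this concrete, here is a minimal Python sketch of building an event in the Event Grid event schema; the topic endpoint and key are placeholders for the generated values, and the actual POST is only indicated in comments:

```python
import json
import uuid
from datetime import datetime, timezone

# Placeholders for the Topic Endpoint and Access Key generated for the topic.
TOPIC_ENDPOINT = "https://mytopic.westeurope-1.eventgrid.azure.net/api/events"
TOPIC_KEY = "<access-key>"

def make_event(subject, event_type, data):
    """Build one event in the Event Grid event schema."""
    return {
        "id": str(uuid.uuid4()),                      # unique event id
        "subject": subject,                           # e.g. "eshop/orders/1234"
        "eventType": event_type,                      # e.g. "eshop.orderCreated"
        "eventTime": datetime.now(timezone.utc).isoformat(),
        "data": data,                                 # publisher-defined payload
        "dataVersion": "1.0",
    }

# Publishers POST a JSON array of events to the Topic Endpoint and
# authenticate the request with the "aeg-sas-key" header, e.g.:
#   headers = {"aeg-sas-key": TOPIC_KEY, "Content-Type": "application/json"}
#   body = json.dumps([make_event("eshop/orders/1234", "eshop.orderCreated",
#                                 {"orderId": 1234})])
```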

 

On the Overview tab you can then define individual subscribers for a given Topic. The most common scenario is a webhook, i.e. hooking an independent web application (the subscriber) into the logical process. A subscriber definition consists of the endpoint URL to which messages are forwarded, the configuration of the event types forwarded to that subscriber, and an optional content filter specified as a subject prefix/suffix; all three are text parameters. Before events can be delivered to a newly defined subscriber, it must first be verified that the subscriber actually wants to receive them. This is essentially protection against DDoS attacks, for which Event Grid would otherwise be an easy tool. The validation works by inserting a short piece of code into the subscriber application that simply echoes back a value from a test call to the endpoint. Once the value comes back, Event Grid considers the subscriber validated. This validation is demonstrated in detail in this video from 10:45: https://www.youtube.com/watch?v=6IJKLsx_evw.
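The validation handshake can be sketched in a few lines; this is an illustrative Python handler (function name is hypothetical), following the Event Grid subscription validation event format:

```python
def handle_event_grid_post(events):
    """Minimal sketch of the body of a webhook subscriber endpoint."""
    for event in events:
        # Event Grid first delivers a validation event; echoing the
        # validationCode back marks this subscriber as validated.
        if event.get("eventType") == "Microsoft.EventGrid.SubscriptionValidationEvent":
            return {"validationResponse": event["data"]["validationCode"]}
        # ...normal handling of the subscribed event types goes here...
    return {}
```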

 

Event Grid vs. Event Hub

Although both services focus primarily on distributing event messages, the difference between Event Grid and Event Hub lies mainly in how they are used. Event Grid forwards individual events as messages of a few kilobytes to defined subscribers, allowing those subscribers to react to events practically in real time. Event Hub is designed as a service for handling larger volumes of data at low latency. Unlike Event Grid, which is cheaper to use, Event Hub is used to receive and store the actual data, not just to publish events about them. Event Hub is typically used for telemetry or logging scenarios, forwarding information about that data to downstream services that process it.

Filip Štorek, MSWare s.r.o.

An issue where a dialog box displayed by the CFileDialog class stops responding


Hello, this is the Platform SDK (Windows SDK) support team.

In this post, we describe a known issue with the MFC CFileDialog class.


Symptom

When a file selection dialog box displayed with bVistaStyle set to FALSE in the CFileDialog constructor is used to open a folder located on SharePoint, the dialog box stops responding.


CFileDialog class

https://msdn.microsoft.com/ja-jp/library/dk77e5e7.aspx


Cause

This issue is caused by a defect in the component that supports displaying the SharePoint server's web view inside the file selection dialog.


Workaround

Applying the following registry setting disables the web view inside the file selection dialog and works around the problem.


- To apply to the current user

Key: HKEY_CURRENT_USER\Software\Policies\Microsoft\Windows\Explorer

Name: NoHTMLViewForWebDAV

Type: REG_DWORD (32-bit)

Value: 1


- To apply to all users who log on to the computer

Key: HKEY_LOCAL_MACHINE\Software\Policies\Microsoft\Windows\Explorer

Name: NoHTMLViewForWebDAV

Type: REG_DWORD (32-bit)

Value: 1
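For convenience, the per-user setting above can be captured in a .reg file, sketched below (apply the HKEY_LOCAL_MACHINE variant analogously):

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Policies\Microsoft\Windows\Explorer]
"NoHTMLViewForWebDAV"=dword:00000001
```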


Status

Microsoft is investigating this issue.
We will update this blog as soon as there is progress.

Release Notes for Field Service and Project Service Automation Update Release 6 on Dynamics 365 version 8.2


Applies to: Field Service for Dynamics 365, Project Service Automation for Dynamics 365, and Universal Resource Scheduling (URS) solution on Dynamics 365 8.2.x

We’re pleased to announce the latest update to the Field Service and Project Service Automation applications for Dynamics 365. This release includes improvements to quality, performance, and usability, and is based on your feedback and requests.

This release is compatible with Dynamics 365 8.2.x. To update to this release, visit the Admin Center for Dynamics 365 online and go to the solutions page to install the update. For details, refer to How to Install, Update a Preferred Solution.

Field Service (v6.2.4.6) Enhancements

Improvements 

  • GDPR compliance

Project Service Automation (v1.2.4.6) Enhancements

Improvements 

  • GDPR compliance

 

Bugs

  • Fixed: Unable to assign the Account record from the account information form of the Account entity
  • Fixed: Corrupted Calendar Rules from Setting Calendar
  • Fixed: Project Estimates do not display category information for line tasks
  • Fixed: Booking plugin causes performance degradation when creating a project team member in PSA
  • Fixed: "Amount" in Expense entry can't exceed three digits when the language is set to Finnish
  • Fixed: Time Entry paste (Ctrl-C + Ctrl-V) msdyn_date time stamp is not set to 12:00pm UTC in the classic calendar experience

 

Universal Resource Scheduling Enhancements

NOTE: Improvements and bug fixes for Universal Resource Scheduling apply to Field Service and Project Service Automation, as well as to other schedulable entities in the Sales or Service applications.

Improvements 

  • Insert day of week next to date on hourly board

 

Insert day of week

 

Bugs 

 

  • Fixed: Corrupted Calendar Rules from Setting Calendar
  • Fixed: Using schedule assistant/find availability to create a booking does not populate lat/long info from the resource requirement

 

For more information:

 

Feifei Qiu

Program Manager

Dynamics 365, Field & Project Service Team

April 14: The Russian Finals of the Imagine Cup 2018 International Student Technology Competition


Imagine Cup is an international technology competition for student projects, launched in 2003. This year, for the 16th time, we are bringing together the best students to pick a winner!

The fifteen best teams from across the country will present their solutions in artificial intelligence, big data, and mixed reality. Leading experts from a wide range of fields (IT, space technology, HR, media) will talk about what our future will look like. The latest technologies, a device exhibition, contests, and much more await you on April 14 at Digital October.

Come cheer for your favorite team, take part in engaging workshops, hear talks from leading experts across industries, and get certified on technologies!

What awaits you at Imagine Cup?

  • The event will run in a Student Day format, and we will do everything we can to make your participation in Imagine Cup as useful and interesting as possible.
  • You can become an active supporter and vote for your favorite team in the audience vote.
  • You will meet many well-known bloggers at the event, who are also coming to cheer for their favorite teams.
  • You will have a unique opportunity to attend 15 TED-style talks on topics relevant to students.
  • Our partners are preparing special interactive workshop-style games where you can apply what you know and learn something new.
  • Microsoft HR staff will explain how to build a career in IT companies and beyond.
  • You can visit the student project exhibition, talk to the contestants in person, and try out the latest technologies at the partner expo.
  • Quests, contests, prizes, and much more!

Admission is free, but registration is required!

Experiencing errors while creating Application Insights apps using Visual Studio – 04/02 – Mitigating

Update: Tuesday, 03 April 2018 22:24 UTC

The hotfix has been successfully deployed in the Canary and BrazilUS regions. We are now prioritizing the hotfix rollout for the remaining regions in the order EastUS, SouthCentralUS, WEU, WUS2, Southeast Asia, and NEU. The current ETA for the hotfix rollout across all regions is EOD Friday.
  • Work Around: Apps can be created using Azure portal without any issues
  • Next Update: EOD Friday

-Dheeraj


Update: Monday, 02 April 2018 21:53 UTC

We have identified the root cause of this issue. To fix it, we are moving forward with a hotfix deployment in this order: EastUS, SouthCentralUS, WEU, WUS2, Southeast Asia, NEU. We currently have no ETA on resolution and are trying to expedite the rollout of this hotfix.
  • Work Around: Apps can be created using Azure portal without any issues
  • Next Update: Before 04/03 22:00 UTC

-Dheeraj


Initial Update: Monday, 02 April 2018 16:55 UTC

We are aware of the issues within Application Insights and are actively investigating. Customers creating a new project with Application Insights on by default in Visual Studio 2015 will see a failure message like the following:

'Could not add Application Insights to project. Could not create Application Insights Resource : The downloaded template from 'https://go.microsoft.com/fwlink/?LinkID=511872' is null or empty. Provide a valid template at the template link. Please see https://aka.ms/arm-template for usage details. This can happen if communication with the Application Insights portal failed, or if there is some problem with your account.'


  • Work Around:  Apps can be created using Azure portal without any issues
  • Next Update: Before 04/02 21:00 UTC

We are working hard to resolve this issue and apologize for any inconvenience.


-Sapna

ODBC Driver 17.1 for SQL Server Released


We are pleased to announce an update to the Microsoft ODBC Driver 17 for SQL Server!  Version 17.1 brings a couple of added features and several fixed issues.

Added

  • Support for SQL_COPT_SS_CEKCACHETTL and SQL_COPT_SS_TRUSTEDCMKPATHS connection attributes
  • Azure Active Directory Interactive Authentication Support

Fixed

This release also contains the following fixed issues:

  • Fixed 1-second delay when calling SQLFreeHandle with MARS enabled and connection attribute "Encrypt=yes"
  • Fixed an error 22003 crash in SQLGetData when the size of the buffer passed in is smaller than the data being retrieved (Windows)
  • Fixed truncated ADAL error messages
  • Fixed a rare bug on 32-bit Windows when converting a floating point number to an integer
  • Fixed an issue where inserting a double into a decimal field with Always Encrypted enabled would not return a data truncation error
  • Fixed a warning on the macOS installer
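As a hedged illustration of how an application might assemble a connection string for this driver (the server and credential values are placeholders; the resulting string would then be handed to an ODBC API such as pyodbc.connect):

```python
def build_connection_string(server, database, uid, pwd):
    """Assemble an ODBC Driver 17 for SQL Server connection string."""
    parts = {
        "Driver": "{ODBC Driver 17 for SQL Server}",
        "Server": server,
        "Database": database,
        "UID": uid,
        "PWD": pwd,
        "Encrypt": "yes",                 # encrypt traffic to the server
        "TrustServerCertificate": "no",   # validate the server certificate
    }
    return ";".join(f"{k}={v}" for k, v in parts.items())

# Example (placeholder values):
conn_str = build_connection_string("myserver.database.windows.net",
                                   "mydb", "myuser", "***")
```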

Next steps

For Windows installations, you can download version 17.1 of the Microsoft ODBC Driver 17 for SQL Server here.

Linux and macOS packages are also available. For installation details see the online instructions.

Roadmap

We are committed to improving quality and bringing more feature support for connecting to SQL Server, Azure SQL Database, and Azure SQL DW through regular driver releases. We invite you to explore the latest that the Microsoft Data Platform has to offer via a trial of Microsoft Azure SQL Database or by evaluating Microsoft SQL Server.

David Engel


Docker Blog Series Part 6 – How to use Service Fabric Reverse Proxy for container services


Service Fabric is a distributed systems platform that makes it easy to package, deploy, and manage scalable and reliable microservices and containers. It also addresses the significant challenges of developing and managing cloud-native applications. Service Fabric acts as an orchestrator of services across a cluster of machines, and Microsoft continues to invest heavily in its container orchestration and management capabilities. In this blog post, we will look at how to use the Service Fabric Reverse Proxy for container services.

Container services are encapsulated, individually deployable components that run as isolated instances on the same kernel to take advantage of virtualization that an operating system provides. Thus, each application and its runtime, dependencies, and system libraries run inside a container with full, private access to the container's own isolated view of operating system constructs. Microservices running inside containers in Service Fabric run on a subset of nodes. Service Fabric orchestration is responsible for service discovery, resolution and routing. As a result, the endpoint for the services running inside the container can change dynamically.

If the container services are deployed in Azure Service Fabric, we can use the Azure Load Balancer to reach the services from outside. To make these services/endpoints accessible to external clients, we configure the Load Balancer to forward traffic to each port that a service uses. This approach works but requires each service to be configured individually on the Load Balancer.

Another approach is to use the reverse proxy. Instead of configuring the port of each individual service in the Load Balancer, we configure only the port of the reverse proxy. The reverse proxy uses the following URI format for service discovery:

http(s)://<Cluster FQDN | internal IP>:Port/<MyApp>/<MyService>
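As a small illustration, a client could assemble such a URI like this (the cluster, application, and service names are placeholders; 19081 is the port commonly configured for the reverse proxy):

```python
def reverse_proxy_url(cluster_fqdn, app, service, port=19081, secure=False):
    """Build the Service Fabric reverse proxy URI for a service.

    The reverse proxy resolves <app>/<service> to the current endpoint of
    the service, so clients do not need to track dynamic ports.
    """
    scheme = "https" if secure else "http"
    return f"{scheme}://{cluster_fqdn}:{port}/{app}/{service}"

# Example (placeholder names):
url = reverse_proxy_url("mycluster.westus.cloudapp.azure.com", "MyApp", "MyService")
```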

 


 

One of the challenges with the reverse proxy approach is that once you configure the reverse proxy port in the Load Balancer, every service in the cluster that exposes an HTTP endpoint becomes addressable from outside the cluster. For this reason, we recommend using the SF reverse proxy for internal services.

But if you have a mix of microservices that includes both internal and external services, you would have to expose the reverse proxy port in the load balancer to reach the external services, which would implicitly also expose the internal services. A workaround for now is to use the port configuration shown below in the Docker Compose file, explicitly dropping the HTTP prefix from the ports. Container services with this configuration will not be exposed through the Load Balancer.
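As a rough illustration of that workaround (the service names are hypothetical, and the exact Service Fabric Compose port syntax is an assumption based on the description above; check the linked blog post for the real file):

```yaml
services:
  externalservice:
    image: myregistry/external-service:latest
    ports:
      - "8080/http"   # http prefix: exposed through the reverse proxy
  internalservice:
    image: myregistry/internal-service:latest
    ports:
      - "8081"        # no http prefix: not exposed through the Load Balancer
```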

You can check out my previous blog post below to see how to do a Docker Compose deployment for Service Fabric.

[Image: Docker Compose port configuration]

As you saw above, you can leverage the Azure Service Fabric reverse proxy to expose different types of services. To read more about it, check out the following links:

https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-reverseproxy

https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-reverse-proxy-diagnostics

https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-reverseproxy-configure-secure-communication

4/11 Webinar: Load Testing Power BI Applications with Visual Studio Load Test


Load testing a SaaS-based service such as Power BI Premium capacity can be quite complex. There are three primary strategies:

  1. Manual testing using something like Fiddler or the F12 tools to identify (and then optimize) slow calls
  2. Using a UI automation testing tool like Visual Studio Coded UI Test that replays tests against the Power BI user interface. Note: a web-only framework like Selenium will not work because it cannot enter the credentials in the login challenge response.
  3. Generating load on the Power BI Premium capacity using protocol-level traffic

Load testing Power BI has the added complexity that it is a very secure platform: security information is sent on every request, and this information must be inserted into the web test calls generating the traffic. Geoff Gray of Gray Test Consulting will walk through this complex process using Visual Studio testing tools such as Visual Studio Web and Performance Test and Visual Studio Load Test.

When: 4/11/2018 10 AM PST

Where: https://www.youtube.com/watch?v=UFbCh5TaR4w

 

 

About Gray Test Consulting

I am a Senior Load and Performance Test Consultant with a degree in Computer and Electrical Engineering and extensive experience helping companies in the IT industry improve their applications’ stability, maintainability and performance while also teaching them how to manage these environments moving forward.

I specialize in short-term, mission-critical, high-visibility efforts, navigating through unknown issues and driving discovery and remediation of complex issues across multiple teams to meet tight deadlines. I also focus heavily on training and knowledge-transfer, ensuring that all of my work and processes are clearly understood by the people I am helping. I believe strongly in “teaching people how to fish.” Much of my experience was earned during my 24+ years working for Microsoft in technical delivery roles.

I have worked with more than 200 customers across several verticals in both private and public sectors, including:

  • Airlines (Alaska Air, United)
  • Automotive (Ford, GM, and others)
  • Banking (Bank of America, Citi, Wells Fargo, and others)
  • Construction/Engineering (Bentley, Dupont, Gulfstream, Kiewit, Schlumberger, and others)
  • eCommerce/Retail (Amway, BestBuy, Hanes, Publix, Target, Total Wine, and others)
  • Financial/Tax (BDO, Ernst and Young, FNF, PWC, and others)
  • Health Care (Abbott, AllScripts, Greenway, McKesson, NextGen, Philips, and others)
  • Hospitality (Disney Magic Express, Walt Disney, and others)
  • Insurance (Aetna, AllState, Geico, Progressive, State Farm, and others)
  • Investment (Merrill Lynch, Raymond James, Smith Barney, and others)
  • ISV/Software (Accenture, Avanade, ECI, Microsystems, Symantec, and others)
  • Education (Denver Public Schools, Laureate, Meridian School District, and others)
  • Public Sector (Air Force, IRS, NHTSA, Veterans Administration, and others)

 

 

 

About timeout errors (MQTT / IoT Device SDK for Java)


In this post, we explain the timeout value used when the MQTT protocol is used with the IoT Device SDK for Java.
With the MQTT protocol in the Device SDK for Java, the keep-alive timeout for the connection to IoT Hub is set to 230 seconds. This keep-alive timeout is hard-coded and cannot be changed from the device application. The keep-alive interval is also 230 seconds and likewise cannot be changed.
It is defined in MqttConnection.java in the SDK:

    //mqtt connection options
    private static final int KEEP_ALIVE_INTERVAL = 230;

The source code is here:
https://github.com/Azure/azure-iot-sdk-java/blob/master/device/iot-device-client/src/main/java/com/microsoft/azure/sdk/iot/device/transport/mqtt/MqttConnection.java

When a timeout occurs, an error message like the following is output:

Timed out as no activity, keepAlive=230,000 lastOutboundActivity=2,623,682,362,159 lastInboundActivity=2,623,672,132,304 time=2,623,672,592,158 lastPing=2,623,672,362,159 [水 03 27 12:53:02 JST 2020]

The sequence leading up to a timeout is shown in the figure below.
[Image: keep-alive timeout sequence]
As the figure shows, the timeout value is 230 seconds (3 minutes 50 seconds), and the keep-alive interval is also 230 seconds. If no keep-alive response is received, connectivity may go unconfirmed for more than 7 minutes (depending on the response time shown in the figure).

Even if a timeout occurs and the connection is dropped, reconnection is normally attempted immediately. If reconnection keeps failing, there may be a problem with the device or its network connectivity. If the device and network appear healthy and a problem on the IoT Hub side is suspected, please contact us.

Azure IoT Developer Support Team, 祝田

New eBook on strengthening cybersecurity for the DoD


The rapid evolution of technology and the exploitation of data, machine intelligence, and on-demand computing power are changing the requirements of cybersecurity. Fortunately, the tools available to the DoD are changing, too.

Despite improvements in defensive capabilities, prevailing research still shows that, on average, it takes agencies more than 200 days to detect a significant breach. Destructive cyberattacks, insider threats, and third-party dependencies have become a real threat to critical infrastructure, supply chains, and, potentially, to life safety.

In the face of increasingly complex attack methods and the growing expense of attack clean-ups, agencies need to understand the value of working with a leading cloud security partner to protect sensitive assets from cyberthreats.

What are the benefits of a leading cloud service provider?

  • Big data threat analysis. As many large cloud providers host millions of servers, they can offer more extensive advanced threat analysis and monitoring than any on-premises capability.
  • Secure facilities. Cloud service providers specialize in keeping data safe both physically through secure facilities and virtually through built-in encryption, retaining highly skilled personnel to do so.
  • Modern platforms. Cloud service providers continually update and patch their platforms without affecting customer workloads.
  • Auditing. Cloud service providers subject their platforms to thorough and frequent auditing to protect against flaws in their security posture.

Learn more about the importance of choosing the right cloud security partner:

 

Strengthening cybersecurity for the Department of Defense (eBook)

 

 

 

We welcome your comments and suggestions to help us improve your Azure Government experience. To stay up to date on all things Azure Government, be sure to subscribe to our RSS feed and to receive emails by clicking “Subscribe by Email!” on the Azure Government Blog.

More advanced call transformations in Azure API Management


I have written about Azure API Management before, and today we will look at an example of somewhat more complex transformations. Do you have an API that is not RESTful and resembles JSON RPC over HTTP? Let's see how it can be transformed into a modern API that any developer will find comfortable.

Imagine the following situation, based on a real, unnamed API I played with. It used the POST method for everything, whereas the RESTful approach uses multiple HTTP verbs (GET for reads, POST for creation, PUT for creation or modification, DELETE for deletion, and PATCH for simple modification). The verb itself was in the URL, for example /get/device, /create/device, or /delete/device. Credentials, the token, and the API version were embedded directly in the JSON query that formed the body, whereas with RESTful APIs you typically want these in headers. The query itself was formulated as JSON in the body; that is common in REST for POST or PUT, but reads and filtering are typically expressed as a query string on a GET. The API always returned HTTP 200, and the response body contained a JSON field named error_code. In a RESTful design you want to use HTTP status codes directly in the response: 200 for OK, 201 for created, 202 for accepted, 400 for a malformed request, 401 for failed authentication, or 403 for failed authorization (the user is valid but has no access to this information).

How could this API be transformed into a RESTful one using Azure API Management?

Liquid syntax in set-body

First, we naturally define the new shape of the API, but when calling the backend we must perform transformations. The first way to build the required JSON is Liquid syntax (similar, for example, to Jinja2 templates in Python).

Suppose the input is a GET aimed at /devices, with defined count and offset query parameters and a defined header carrying the token. How do we turn this into a POST call and fold everything into the JSON body the legacy API expects?
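A hedged sketch of what such an inbound policy could look like; set-method, rewrite-uri, and set-body with template="liquid" are real API Management policies, but the exact Liquid context variable names used here are assumptions for illustration:

```xml
<inbound>
    <base />
    <!-- The legacy API expects POST with the verb in the URL -->
    <set-method>POST</set-method>
    <rewrite-uri template="/get/device" />
    <!-- Fold the query parameters into the JSON body the backend expects -->
    <set-body template="liquid">
    {
        "count": "{{context.Request.OriginalUrl.Query.count}}",
        "offset": "{{context.Request.OriginalUrl.Query.offset}}"
    }
    </set-body>
</inbound>
```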

Continue reading

Azure Active Directory Authentication (Easy Auth) with Multiple Backend APIs


This blog post is a continuation of a scenario that I discussed previously in my blog post on Azure Active Directory Authentication with Custom Backend Web API. I also expanded on that post with some notes on local debugging of apps using Easy Auth. If you didn't read those posts, you may want to start by browsing through them to make it easier to follow along.

I discussed how you can use the access token obtained in an Azure Web App using App Service Authentication (a.k.a. Easy Auth) to access a backend API. The basic principle is to use the resource=APP-REGISTRATION-ID-OR-URI parameter when authorizing with Azure Active Directory. For example, a call like (with additional parameters):

POST https://login.microsoftonline.com/TENANT-NAME.onmicrosoft.com/oauth2/authorize?resource=https://graph.microsoft.com&response_type=id_token code
 

would start the authentication flow and ultimately lead to the web application obtaining a token that can be used to access the Microsoft Graph API. App Service Authentication (Easy Auth) automates this for you and makes the tokens available to the application without the need for any authentication-specific code in the application. In the previous post, we set the resource specification to the ID of an Azure App Registration used to secure access to a custom backend API. At the end of that walkthrough, I hinted at a potential problem: what happens if you would like to access multiple backend APIs? A given token can only be used with one API (unless the APIs share the same app registration). In the following, I will show that you can obtain tokens for additional APIs using an on-behalf-of flow. As a specific example, I will demonstrate how to access a custom API (the List API from the previous example) and the Microsoft Graph API. The scenario looks something like this:

 

 

The Graph API could be used to access documents in the user's Office 365 or to look up information in Azure Active Directory. The link to Office 365 in the illustration is just an example.

The first step is to add Graph API to the required permissions of the app registration:

In this case, we have somewhat arbitrarily added three delegated permissions on the Graph API, but you can decide which are appropriate for your application. Please consult the previous blog post for details on how to set up the app registration and configure App Service Authentication.

Next we will add functionality to the application to access the Graph API. You can access the updated application code on GitHub. Compared to the previous version of the application, I have introduced the EasyAuthProxy to make it easier to do local debugging of the code as described in a previous blog post. The functionality for accessing the Graph API can be found in GraphController.cs:

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using ListClientMVC.Models;
using ListClientMVC.Services;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;
using Newtonsoft.Json;

namespace ListClientMVC.Controllers
{
    public class GraphController : Controller
    { 

        private IEasyAuthProxy _easyAuthProxy;
        private IConfiguration _configuration;

        public GraphController(IEasyAuthProxy easyproxy, IConfiguration config) {
            _easyAuthProxy = easyproxy;
            _configuration = config;
        }

        public async Task<IActionResult> Index()
        {
            //Get a token for the Graph API
            string id_token = _easyAuthProxy.Headers["x-ms-token-aad-id-token"];
            string client_id = _configuration["AADClientID"];
            string client_secret = _configuration["AADClientSecret"];
            string aad_instance = _configuration["AADInstance"];

            var client = new HttpClient();

            var content = new FormUrlEncodedContent(new[]
            {
                new KeyValuePair<string, string>("grant_type", "urn:ietf:params:oauth:grant-type:jwt-bearer"),
                new KeyValuePair<string, string>("assertion", id_token),
                new KeyValuePair<string, string>("requested_token_use", "on_behalf_of"),
                new KeyValuePair<string, string>("scope", "User.Read"),
                new KeyValuePair<string, string>("client_id", client_id),
                new KeyValuePair<string, string>("client_secret", client_secret),
                new KeyValuePair<string, string>("resource", "https://graph.microsoft.com"),
                                
            });

            var result = await client.PostAsync(aad_instance + "oauth2/token", content);
            string resultContent = await result.Content.ReadAsStringAsync();

            if (result.IsSuccessStatusCode) {
                //Call Graph API to get some information about the user
                TokenResponse tokenResponse= JsonConvert.DeserializeObject<TokenResponse>(resultContent);

                client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", tokenResponse.access_token);
                var response = await client.GetAsync("https://graph.microsoft.com/v1.0/me");
                var cont = await response.Content.ReadAsStringAsync();
                ViewData["me"] = cont;
            } else {
                ViewData["me"] = "Failed to access MS Graph";
            }

            return View();
        }
    }
}

For simplicity, I have added all the required code for getting a token and passing it to the Graph API in a single function. In a more practical application, you would probably put this functionality in some sort of service that caches tokens and manages refreshing expiring tokens. The code goes through two phases: 1) a call to https://login.microsoftonline.com/TENANT.onmicrosoft.com/oauth2/token to obtain a token, and 2) a call to the Graph API where the token is passed as a bearer token in an authorization header.

The first time the application is accessed, it will ask for delegated permissions. Compared to the previous application, the user is now asked for Graph permissions in addition to the List API:

After granting permissions, it is possible to access the updated version of the application:

As shown above, there are now a couple of new menu items, one for accessing the list through the backend List API and one for accessing the Microsoft Graph API.

And that is it, we have created an application that uses Easy Auth to authenticate the users. After authentication, the user can access a backend List API directly using the token obtained as part of the Easy Auth flow and the user will also be able to access the Microsoft Graph API by having the application obtain a token "on-behalf-of" the user.

If you prefer to handle all user authentication in your application code, you can use OpenID Connect. Please see this example for a very similar flow using OpenID Connect and a custom backend API. For completeness, I have made another version of the ListClient that uses OpenID Connect. Please refer to that code repository for details.

Let me know if you have questions/comments/suggestions.

APRIL 2018 EDITION – HOT SHEET PARTNER TRAINING SCHEDULE


Welcome to the Australian Partner Community Hot Sheet, a comprehensive schedule of partner training, webcasts, community calls, and office hours. This post is updated frequently as we learn about new offerings, so you can plan ahead of time.

If you have any questions/feedback about any of the training below or would like to suggest training for us to run, please email the Microsoft Australia Readiness Team at msaupr@microsoft.com.


MICROSOFT PARTNER NETWORK NEWS


LEARNING NEWS


SPOTLIGHT

Australian New Partner Orientation Community Call

Type: Beginners

Audience: Anyone

Cost: No Cost

Date & Locations: Skype (April 26), (May 24), (June 28)

On this call, which will last about 45 minutes, we take you through a demo of the Microsoft Partner Network to help you navigate where to find information to support your business. We will also answer questions from attendees. Join the call to learn about your options for membership in the Microsoft Partner Network, the associated benefits you receive, and where to find the right support and assets. Our MPN 101 resource explains program basics and provides you with links to get further details about your areas of interest. REGISTER HERE


MICROSOFT AZURE

Implementing Microsoft Azure Infrastructure Solutions

Type: Technical (L300)

Audience: IT Professional

Cost: $761

Product: Microsoft Azure

Date & Locations: Sydney, Canberra, Melbourne (April 9-13); Brisbane, Adelaide, Perth (April 16-20)

This course teaches IT professionals how to provision and manage services in Microsoft Azure. Students will learn how to implement infrastructure components such as virtual networks, virtual machines, containers, web and mobile apps, and storage in Azure. Students also will learn how to plan for and manage Azure AD and configure Azure AD integration with on-premises Active Directory domains. REGISTER HERE

Microsoft Azure Fundamentals

Type: Technical (L100)

Audience: IT Professional

Cost: $367

Product: Microsoft Azure

Date & Locations: Sydney, Canberra, Melbourne (May 21-22)

This course will introduce students to the principles of cloud computing. Students will become familiar with how these principles have been implemented in Microsoft Azure. In addition, this course will explain how to implement the core Azure infrastructure, consisting of virtual networks and storage. With this foundation, students will learn how to create the most common Azure services, including Azure Virtual Machines, Web Apps, and Azure SQL Database. The course will conclude by describing features of Azure AD and methods of integrating it with on-premises Active Directory. REGISTER HERE

Integrating On-Premises Identity Infrastructure with Microsoft Azure

Type: Technical (L300)

Audience: IT Professional / Development Operations

Cost: $761

Product: Microsoft Azure

Date & Locations: Sydney, Canberra, Melbourne, Adelaide (April 30 – May 4)

This 5-day instructor-led course covers a range of components, including Azure Compute, Azure Storage, and network services that customers can benefit from when deploying hybrid solutions. In this context, the term hybrid means integrating infrastructure technologies that customers host in on-premises datacentres with Azure IaaS and PaaS services. This course offers an overview of these services, providing the knowledge necessary to design hybrid solutions properly. It also includes several demonstrations and labs that enable students to develop hands-on skills that are necessary when implementing such solutions. REGISTER HERE

Architecting Microsoft Azure Solutions

Type: Technical (L300)

Audience: IT Professional / Architects

Cost: $761

Product: Microsoft Azure

Date & Locations: Sydney, Canberra, Melbourne (May 7-11); Brisbane, Adelaide, Perth (May 14-18)

This course is intended for architects who have experience building infrastructure and applications on the Microsoft Azure platform. Students should have a thorough understanding of most services offered on the Azure platform. The students typically work for organizations that have an active solution on Azure and are planning to enhance existing solutions or deploy more solutions to the Azure platform. This course also is intended for architects who want to take the Microsoft Certification exam, 70-535, Architecting Microsoft Azure Solutions. REGISTER HERE

Operationalize Cloud Analytics Solutions with Microsoft Azure

Type: Technical (L300)

Audience: IT Professional / Architects

Cost: $367

Product: Microsoft Azure

Date & Locations: Sydney, Melbourne, Perth (April 26-27)

This course is a two-day instructor-led course intended for data professionals who want to expand their knowledge about creating big data analytic solutions on Microsoft Azure. Students will learn how to operationalize end-to-end cloud analytics solutions using the Azure Portal and Azure PowerShell. REGISTER HERE

Configuring and Operating a Hybrid Cloud with Microsoft Azure Stack

Type: Technical (L300)

Audience: IT Professional

Cost: $761

Product: Microsoft Azure

Date & Locations: Sydney, Melbourne, Brisbane (May 28 – June 1)

This five-day course will provide students with the key knowledge required to deploy and configure Microsoft Azure Stack. REGISTER HERE

Developing Microsoft Azure Solutions

Type: Technical (L400)

Audience: IT Professional

Cost: $630

Product: Microsoft Azure

Date & Locations: Sydney, Canberra, Adelaide (June 4-7); Perth, Brisbane, Melbourne (June 12-15)

This course is intended for students who have experience building ASP.NET and C# applications. Students will also have experience with the Microsoft Azure platform and a basic understanding of the services offered. This course offers students the opportunity to take an existing ASP.NET MVC application and expand its functionality as part of moving it to Azure. This course focuses on the considerations necessary when building a highly available solution in the cloud. REGISTER HERE

Azure Skills Initiative

Increase your Azure skills – free training and discounted certification offers now available for Microsoft customers and partners. Since December 2016, there have been over 84,000 enrolments in the first 15 Azure courses that we have released on Open edX on Azure. These courses are technical, in-depth (typically 10-20 hours of content each, depending on the learner’s level of engagement), and hands-on, with a mix of content types and interactivities. Learners who do well enough on assessments, and in practical exercises like labs, achieve a passing score that validates targeted knowledge and skills, and they receive a digital certificate that can be shared in social profiles and elsewhere online. GET STARTED TODAY!


DATA ANALYTICS

Azure Big Data Analytics Solutions

Type: Technical (L200)

Audience: IT Professional

Cost: $367

Product: Microsoft Azure

Date & Locations: Sydney, Canberra, Melbourne (April 23-24)

This two-day instructor-led course is intended for data professionals who want to expand their knowledge about creating big data analytic solutions on Microsoft Azure. Students will learn how to design solutions for batch and real-time data processing. Different methods of using Azure will be discussed and practiced in lab exercises, such as Azure CLI, Azure PowerShell and the Azure Portal. REGISTER HERE

Architecture Big Data Analytics

Type: Technical (L300)

Audience: IT Professional / Architects

Cost: $699

Product: Microsoft Azure

Date & Locations: Sydney (May 7-9): Melbourne (May 21-23); Perth (June 11-13)

The Azure Big Data and Analytics Bootcamp is designed to give students a clear architectural understanding of the application of big data patterns in Azure. Students will be taught basic Lambda architecture patterns in Azure, leveraging the scalability and elasticity of Azure in Big Data and IoT solutions. An introduction to data science techniques in Azure will also be covered. REGISTER HERE

Cognitive Services – Learning Paths

AI: Optical Character Recognition using Cognitive Services

This intermediate-level Artificial Intelligence learning path provides an overview of the Cognitive Services - OCR API and shows how to build an application using it. START IT NOW

AI: Image Classification using Cognitive Services

This intermediate-level Artificial Intelligence learning path provides an overview of the Cognitive Services - Custom Vision API and shows how to build an application using it. START IT NOW

Machine Learning – Learning Paths

AI: Custom Deep Neural Network models for OCR

This advanced-level Artificial Intelligence learning path provides instruction on how to build a deep neural network for recognizing characters/digits in an image. START IT NOW

AI: Custom Deep Neural Network Models for Object Recognition

This advanced level Artificial Intelligence learning path provides instruction on how to build a custom Deep Neural network for object detection and classification. START IT NOW


MICROSOFT 365

Microsoft Security & Compliance Practice Enablement

Type: Technical (L200)

Audience: IT Professional

Cost: $699

Product: Office 365 / Windows / Security / EMS

Date & Locations: Brisbane (April 30 – May 2); Sydney (May 7-9); Melbourne (May 14-16)

This 3-day, instructor-led course offers technical training covering Security and Compliance topics including: Identity & Access Management, Enterprise Level Identity Protection, Proactive Attack Detection & Prevention, Controlling & Protecting Information, and GDPR and Regulatory Compliance. This content is delivered through lectures, case studies, videos, demos, and hands-on labs. This unique event complements others by offering technical training intended to help Microsoft partners understand, deploy and manage the inter-related mix of technologies enabling today’s modern workplace to secure data and comply with regulation while functioning efficiently in a cloud and mobile dominated world. REGISTER HERE

Master Microsoft 365. Take training to build a practice that takes your business to the next level

  • Microsoft 365 Business Overview: Discover how Microsoft 365 Business can help your customers improve their productivity and protect their data from security threats. Learn More Today
  • Cloud Voice: See how this solution provides services, security, and support that traditional phone lines can’t match. Explore Cloud Voice
  • Security & Compliance: Learn how Microsoft 365 helps organisations with content security and data usage compliance. Explore Security & Compliance
  • Collaboration: Discover how Microsoft 365 solutions help your customers collaborate across their organisation. Explore Collaboration
  • Microsoft 365 powered device: Find out how you make device security top priority, while easing IT transition to cloud-based management. Explore Microsoft 365 powered device

MICROSOFT DYNAMICS

Dynamics 365 customer engagement Enterprise Edition University for Sales Professionals

Type: Sales (L200)

Audience: Sales Professionals

Cost: $1,600 USD (Use code UsP299008  to receive a discounted rate)

Product: Dynamics 365

Date & Locations: Sydney (April 30 – May 2); Melbourne (May 7-9)

This course is about learning how to improve the way you sell Microsoft Dynamics 365 and related Microsoft solutions. SYDNEY REGISTER HERE, MELBOURNE REGISTER HERE

Dynamics 365 for Finance and Operations, Retail and Talent University for Sales Professionals

Type: Sales (L200)

Audience: Sales Professionals

Cost: $1600 USD (Use code UsP299008  to receive a discounted rate)

Product: Dynamics 365

Dates & Locations: Sydney (May 2-4); Melbourne (May 9-11)

This course is designed to describe the best practices for selling the Microsoft Dynamics 365 solution into a variety of industries. It will cover the key industries that the solution is designed for and how to position the product suite in each of the industries. Additionally, attendees of this course will learn how to deliver the Microsoft cloud story and explain the value proposition and key messaging for Dynamics 365 and the related Microsoft solutions such as Power BI, PowerApps, Microsoft Flow, and Office 365.

SYDNEY REGISTER HERE, MELBOURNE REGISTER HERE

Advanced Dynamics 365 Business Central Trade for Consultants

Type: Technical (L300)

Audience: Application Consultants

Cost: $700 USD

Product: Microsoft Dynamics NAV

Date & Locations: Skype for Business May 15-18 12:00am – 03:00am

This course provides knowledge and insight into the trade and inventory management application area in Microsoft Dynamics 365 Business Central. The focus is on the most important trade related settings and functions, such as selling and purchasing items and services, controlling inventory, item price and discount management, and requisition management. It also covers important inventory functions such as item tracking, assembly management and location transfers. Please note you must have basic experience with Microsoft Dynamics NAV. REGISTER HERE

Building Apps with Dynamics 365 Business Central

Type: Technical (L300)

Audience: developers, solution architects and technical consultants

Cost: $700 USD

Product: Microsoft Dynamics NAV

Date & Locations: Skype for Business May 29 – June 9 12:00am – 03:00am

The Building Apps with Dynamics 365 Business Central course is designed to help solution architects and developers design and develop extensions for Dynamics 365 Business Central. Participants will get an overview and in-depth information about the technical aspects involved in designing a great app or extension. Please note students must have Dynamics 365 Business Central or Microsoft Dynamics NAV implementation and architecture experience and be familiar with Dynamics 365 Business Central or Microsoft Dynamics NAV development topics. REGISTER HERE

Advanced Dynamics 365 Business Central Finance for Consultants

Type: Technical (L300)

Audience: application consultants

Cost: $700 USD

Product: Microsoft Dynamics NAV

Date & Locations: Skype for Business June 5-12 12:00am – 03:00am

This course provides knowledge and insight into the financial management application area in Microsoft Dynamics 365 Business Central. The focus is on both basic and more advanced financial functions within the organization such as the chart of accounts, cash management, processing invoices, OCR, cash flow forecasting, cost accounting, financial reporting, and the year-end closing process. Students must have basic experience with Microsoft Dynamics NAV. REGISTER HERE


AUSTRALIAN PARTNER SELLER MEMBERS TRAINING

Advanced Technical Training for Cloud and Data Platform Architects

Type: Technical (L400)

Audience: Only open to registered Partner Sellers

Cost: $1,800 USD

Dates: April 16-20, 2018

Location: Hyatt Regency Hotel | Bellevue, WA, USA

In an increasingly digital world, people, not technology will disrupt the status quo. As an IT pro working amid the opportunity of digital transformation, you’re going to be asked to do things differently. You’ll need deep skills, expertise, and confidence to implement meaningful technical solutions that meet advanced customer needs. We’re investing big in helping you develop that skill set with the Intelligent Cloud Architect Boot Camp. REGISTER HERE


MICROSOFT GLOBAL FLAGSHIP EVENTS

MICROSOFT BUILD 2018

Audience: Developers

Cost: US$2,495

Duration: 3 days

Location: Seattle, Washington State (May 7-9, 2018)

Get an inside look at the tools and platforms that matter to you most, including: Azure, Windows, Visual Studio, Microsoft 365, and more. Build across any app, language, and platform, Microsoft Build is the event for all developers. REGISTER HERE

MICROSOFT INSPIRE 2018

Audience: Business Professionals, CEO, General Managers, Sales Directors, Marketing Professionals

Cost: US$2,295

Duration: 5 days

Location: Las Vegas, Nevada (July 15-19, 2018)

Join us at the 2018 Microsoft Inspire as we redefine productivity in a mobile-first, cloud-first world. Microsoft Inspire is the largest partner event of the year—a place to connect with fellow partners and with Microsoft, hear about the future direction of Microsoft and the broader IT industry, discover new business opportunities, learn, teach, and share with peers. With past attendance reaching 16,000 attendees from over 140 countries, no other event brings together so many of the most successful, top-tier partners in the Microsoft Partner Network. Microsoft Inspire is where leaders and visionaries of the IT industry gather to spur innovation and growth to new heights. Harness the potential of these great minds and build partnerships to expand the reach of your own solutions. Microsoft Inspire allows you to discover new opportunities and leverage industry connections to get business done. REGISTER HERE

MICROSOFT ENVISION 2018

Audience: Business Professionals, CEO, General Managers, Sales Directors, Marketing Professionals

Duration: 3 days

Location: Orlando Florida (September 24-26, 2018)

Microsoft Envision is the event that gathers entrepreneurial business leaders from across industries eager to gain the latest insights and transformative strategies to drive digital transformation. Exchange innovative ideas with fellow leaders on how to engage customers and employees, and leave inspired to elevate your model and expand your enterprise with cutting-edge technology. STAY UPDATED

MICROSOFT IGNITE 2018

Audience: Business Professionals, CEO, General Managers, Sales Directors, Marketing Professionals

Cost: US$2,395

Duration: 5 days

Location: Orlando Florida (September 24-28, 2018)

Join us at Microsoft Ignite for the latest insights from technology leaders and practitioners shaping the future of cloud, data, business intelligence, productivity, and teamwork. Register now to get exclusive access to cutting-edge technologies, expert-led sessions, and valuable networking at Microsoft’s largest and most comprehensive event. REGISTER HERE


2017 MICROSOFT GLOBAL FLAGSHIP EVENTS (ON DEMAND)

MICROSOFT BUILD 2017

Audience: Developers

WATCH SESSIONS FROM MICROSOFT BUILD 2017.

MICROSOFT DATA INSIGHTS SUMMIT 2017

Audience: Business Professionals, CEO, General Managers, Sales Directors, Marketing Professionals

WATCH SESSIONS FROM MICROSOFT DATA INSIGHTS SUMMIT.

MICROSOFT INSPIRE 2017

Audience: Business Professionals, CEO, General Managers, Sales Directors, Marketing Professionals

WATCH SESSIONS FROM MICROSOFT INSPIRE 2017.

MICROSOFT IGNITE 2017

Audience: IT Professionals

WATCH SESSIONS FROM MICROSOFT IGNITE 2017.


Expression of interest: please email msaupr@microsoft.com with the course title in the subject line. Once the course opens, we will email you the invite.


The MIPS R4000, part 3: Multiplication, division, and the temperamental HI and LO registers



The MIPS R4000 can perform multiplication and division in hardware,
but it does so in an unusual way, and this is where the temperamental
HI and LO registers enter the picture.



The HI and LO registers are 32-bit registers
which hold or accumulate the results of a multiplication or division.
You cannot operate on them directly.
They are set by a suitable arithmetic operation,
and by special instructions for moving values in and out.



The multiplication instructions treat
HI and LO
as a logical 64-bit register,
where
the high-order 32 bits are in the HI register
and the low-order 32 bits are in the LO register.



MUL rd, rs, rt ; rd = rs * rt, corrupts HI and LO
MULT rs, rt ; HI:LO = rs * rt (signed)
MULTU rs, rt ; HI:LO = rs * rt (unsigned)


The simplest version is MUL, which multiplies two
As a side effect, it corrupts the HI and LO registers.
(This is the only multiplication or division operation that puts the result
in a general-purpose register instead of into
HI and LO.)



The MULT instruction multiplies two signed 32-bit values
to form a 64-bit result,
which it stores in HI and LO.



The MULTU instruction does the same thing,
but treats the factors as unsigned.
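To make the HI:LO split concrete, here is a small Python model of the architectural result of MULT and MULTU (a sketch of the register contents, not of the hardware):

```python
MASK32 = 0xFFFFFFFF
MASK64 = 0xFFFFFFFFFFFFFFFF

def signed32(x):
    """Interpret a 32-bit pattern as a signed value."""
    return x - (1 << 32) if x & 0x80000000 else x

def mult(rs, rt):
    """MULT: signed 32x32 multiply; 64-bit product split into (HI, LO)."""
    prod = (signed32(rs & MASK32) * signed32(rt & MASK32)) & MASK64
    return (prod >> 32) & MASK32, prod & MASK32

def multu(rs, rt):
    """MULTU: same split, but the factors are treated as unsigned."""
    prod = ((rs & MASK32) * (rt & MASK32)) & MASK64
    return (prod >> 32) & MASK32, prod & MASK32
```

For example, mult(0xFFFFFFFF, 2) gives (0xFFFFFFFF, 0xFFFFFFFE), the 64-bit two's-complement form of -2, whereas multu(0xFFFFFFFF, 2) gives (1, 0xFFFFFFFE).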



The next group of multiplication instructions performs accumulation.



MADD rs, rt ; HI:LO += rs * rt (signed)
MADDU rs, rt ; HI:LO += rs * rt (unsigned)
MSUB rs, rt ; HI:LO -= rs * rt (signed)
MSUBU rs, rt ; HI:LO -= rs * rt (unsigned)


After performing the appropriate multiplication operation,
the 64-bit result is added to or subtracted from the value currently
in the HI and LO registers.



Note that the U suffix applies to the signed-ness
of the multiplication, not to whether the operation traps on
signed overflow during addition or subtraction.
None of the multiplication instructions trap.
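The accumulation can be modeled the same way. This self-contained Python sketch shows MADD and MSUB updating the 64-bit HI:LO accumulator with wraparound rather than trapping:

```python
MASK32 = 0xFFFFFFFF
MASK64 = 0xFFFFFFFFFFFFFFFF

def signed32(x):
    """Interpret a 32-bit pattern as a signed value."""
    return x - (1 << 32) if x & 0x80000000 else x

def madd(hi, lo, rs, rt):
    """MADD: HI:LO += rs * rt (signed); wraps at 64 bits, never traps."""
    acc = ((hi & MASK32) << 32) | (lo & MASK32)
    acc = (acc + signed32(rs & MASK32) * signed32(rt & MASK32)) & MASK64
    return (acc >> 32) & MASK32, acc & MASK32

def msub(hi, lo, rs, rt):
    """MSUB: HI:LO -= rs * rt (signed); wraps at 64 bits, never traps."""
    acc = ((hi & MASK32) << 32) | (lo & MASK32)
    acc = (acc - signed32(rs & MASK32) * signed32(rt & MASK32)) & MASK64
    return (acc >> 32) & MASK32, acc & MASK32
```

Starting from HI:LO = 0:10, madd(0, 10, 2, 3) yields (0, 16) and msub(0, 10, 2, 3) yields (0, 4).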



The operation runs faster if you put the smaller factor in rt,
so if you know (or suspect) that one of the values is smaller than the
other, you can try to arrange for the smaller number to be in rt.



You might think that the division operations take a 64-bit value
in HI and LO and divide it by a 32-bit register.
But you'd be wrong.
They divide a 32-bit value by another 32-bit value and store the
quotient and remainder in HI and LO.



DIV rs, rt ; LO = rs / rt, HI = rs % rt (signed)
DIVU rs, rt ; LO = rs / rt, HI = rs % rt (unsigned)


None of the division operations trap,
not even for overflow or divide-by-zero.
If you divide by zero or incur division overflow, the results in
HI and LO are garbage.
If you care about overflow or division by zero,
you need to check for it explicitly.
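The architectural result of DIV and DIVU can be sketched the same way in Python. This sketch assumes the usual MIPS convention that signed quotients truncate toward zero and the remainder takes the dividend's sign; the divide-by-zero check stands in for the garbage the hardware would leave behind:

```python
MASK32 = 0xFFFFFFFF

def signed32(x):
    """Interpret a 32-bit pattern as a signed value."""
    return x - (1 << 32) if x & 0x80000000 else x

def div(rs, rt):
    """DIV: returns (LO, HI) = (quotient, remainder), signed."""
    a, b = signed32(rs & MASK32), signed32(rt & MASK32)
    if b == 0:
        # The real hardware does not trap; it just leaves garbage in HI/LO.
        raise ZeroDivisionError("HI/LO would be garbage")
    q = abs(a) // abs(b)
    if (a < 0) != (b < 0):
        q = -q                     # truncate toward zero
    r = a - q * b                  # remainder takes the dividend's sign
    return q & MASK32, r & MASK32

def divu(rs, rt):
    """DIVU: returns (LO, HI) = (quotient, remainder), unsigned."""
    a, b = rs & MASK32, rt & MASK32
    if b == 0:
        raise ZeroDivisionError("HI/LO would be garbage")
    return (a // b) & MASK32, (a % b) & MASK32
```

For instance, div(7, 2) gives (3, 1), and dividing -7 by 2 gives a quotient of -3 with remainder -1 (as 32-bit patterns, 0xFFFFFFFD and 0xFFFFFFFF).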



Okay, that's great.
We've done some calculations and put the results into
HI and LO.
But how do we get the answer out?
(And how do you put the initial values in, if you are using
MADD or MSUB?)



MFHI rd ; rd = HI "move from HI"
MFLO rd ; rd = LO "move from LO"
MTHI rs ; HI = rs "move to HI"
MTLO rs ; LO = rs "move to LO"


The multiplication and division operations take some time to execute,¹
and if you try to read the results too soon, you will stall
until the results are available.
Therefore, it's best to distract yourself with some other operations
while waiting for the multiplication or division operation to do its thing.
(For example, you might check if you need to raise a runtime
exception because you just asked the processor to divide by zero.)



The temperamental part of the
HI and LO registers is in how you read
the values out.



Tricky rule number one:
Once you perform a MTHI or MTLO instruction,
both of the previous values in
HI and LO are lost.
That means you can't do this:



MULT r1, r2 ; HI:LO = r1 * r2 (signed)
... stuff that doesn't involve HI or LO ...
MTHI r3 ; HI = r3
... stuff that doesn't involve HI or LO ...
MFLO r4 ; r4 = GARBAGE


You might naïvely think that the MTHI replaces
the value in the HI register and leaves the
LO register alone,
but since this is the first write to either of the
HI or LO registers since the last
multiplication or division operation,
both registers are lost, and your attempt to fetch
the value of LO will return garbage.



Note that this applies only to the first write to HI
or LO.
The second write behaves as you would expect.
For example,
if you perform MTHI followed by MTLO,
the MTHI will set HI and corrupt LO,
but the MTLO will set LO and leave
HI alone.
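Tricky rule number one can be captured in a tiny behavioral model (a sketch, not anything cycle-accurate): the first MTHI or MTLO after a multiplication or division corrupts the other register, while a second write behaves normally.

```python
GARBAGE = object()   # sentinel standing in for an unpredictable value

class HiLo:
    """Behavioral model of tricky rule number one."""

    def __init__(self):
        self.hi = self.lo = 0
        self._written = True   # becomes False after each mult/div result

    def set_result(self, hi, lo):
        """Model a multiplication or division writing HI and LO."""
        self.hi, self.lo = hi, lo
        self._written = False

    def mthi(self, value):
        if not self._written:
            self.lo = GARBAGE   # first write since the last op corrupts LO
        self.hi = value
        self._written = True

    def mtlo(self, value):
        if not self._written:
            self.hi = GARBAGE   # first write since the last op corrupts HI
        self.lo = value
        self._written = True
```

After set_result followed by mthi, the model reports LO as garbage; a subsequent mtlo sets LO and leaves HI alone, matching the second-write behavior described above.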



Tricky rule number two:
If you try to read a value from HI or LO,
you must wait two instructions before performing any operation
that writes to
HI or LO.
Otherwise, the reads will produce garbage.
The instruction that writes to
HI or LO
could be a multiplication or division operation, or it could be
MTHI or MTLO.



Tricky rule number two
means that the following sequence is invalid:



DIV r1, r2 ; LO = r1 / r2, HI = r1 % r2 (signed)
... stuff that doesn't involve HI or LO ...
MFHI r3 ; r3 = r1 % r2 GARBAGE
MULT r4, r5 ; HI:LO = r4 * r5 (signed)


Since the MULT comes too soon after the
MFHI, the MFHI will put garbage
into r3.
You need to stick two instructions between the
MFHI and the MULT in order to avoid this.



(Tricky rule number two was removed in the R8000.
On the R8000, if you perform a multiplication or division or
MTxx
too soon after a MFxx,
the processor will stall until the danger window has passed.)



Okay, next time we'll look at constants.



¹ Wikipedia says that latency of 32-bit multiplication was 10 cycles,
and latency of 32-bit division was a whopping 69 cycles.

devconf’s log #5 – Sad it’s over, stoked over chats, and the 25h trek home


Continued from devconf’s log #4 – “Was zum Geier?”, lean coffee, and rain, it’s time to close the curtains and start the countdown to next year!

Before I share the final pictures from the event, I’d like to THANK the hosts (Robert and Candice) for a phenomenal event and the speakers & attendees for great sessions and more importantly great discussions. Let’s keep the awesome chats going!

The event in Cape Town was my personal favourite for a number of reasons:

  • The day started with an energetic Lean Coffee event – missed the one in Johannesburg as I was in zombie mode
  • It was a smaller and cozier event – only three sessions to choose from at a time
  • We had a lot more discussions on DevOps and VSTS in the hallways
  • We brought the rain to drought-stricken Cape Town

OK, here are a few more pictures:

Last traces of devconf.co.za in Cape Town

My favourite water shortage sign


The event hotel in Cape Town was an oasis


Stunning Cape Town


Sorry for the delay in this final post … the 25-hour trek from Cape Town to Vancouver really took it out of me.

If you have any questions or feedback on my Moving 65000 Microsofties to DevOps with Visual Studio Team Services talk, please add a comment below, or tweet me on @wpschaub.

See you next time!

DIXF on SQL 2014 – Cannot Preview source file – CommunicationException – An error occurred while receiving the HTTP response from DMFServiceHelper.svc


Consider a scenario where you are importing data using DIXF: you get an error when you try to “Preview source file”, and you are running SQL Server 2014 SP2 CU10 or CU11.

1. The error

The error you see looks something like this:

System.ServiceModel.CommunicationException: An error occurred while receiving the HTTP response to http://servernamehere:7000/DMFService/DMFServiceHelper.svc. This could be due to the service endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being aborted by the server (possibly due to the service shutting down).

We also see these sorts of events in the Windows Event Log:

Event 1026 - .NET Runtime - Application: Microsoft.Dynamics.AX.DMF.SSISHelperService.exe - The process was terminated due to an unhandled exception - exception code 1

Event 1000 - Application Error - Faulting application name: Microsoft.Dynamics.AX.DMF.SSISHelperService.exe, version:6.3.6000.3667 - Faulting module name: KERNELBASE.dll, version: 6.3.9600.18202

2. Cause

There is an SSIS issue in SQL Server 2014 SP2 CU10 and CU11 for which a fix will ship in SP2 CU12.

3. Possible workarounds

On the AX side, the issue will repro only when "Create error file" is enabled in "Data import/export framework parameters". Thus, as a workaround, users can un-check this option until the SSIS fix has shipped.

Another option is to uninstall the SQL Server CU that is causing the issue and wait for CU12 to ship.

Please implement this in TEST first, before rolling out in PROD, in line with your change management processes.

Azure Security Center – Custom Alerts


Many of my clients have asked whether you can extend the alerting in Azure Security Center (ASC). The answer is yes: a few months back, custom alerts went into public preview. This lets you take a Log Analytics query and have it evaluated in ASC. It's ideal if, for example, your application generates notable security events, or if you are using sources not currently supported in ASC. Creating custom alerts is very simple; a quick run-through of an example follows.

Create your query either in “Log Search” or in “Analytics”. Here is a simple query to list Windows systems that have had their event logs cleared:

SecurityEvent
| where (EventID == 1102 or EventID == 517) and EventSourceName == "Microsoft-Windows-Eventlog"
| summarize AggregatedValue = count() by Computer
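As a side note, you can also run the same query programmatically against the workspace via the Log Analytics v1 query REST endpoint. The Python sketch below only builds the request; the workspace ID and bearer token are placeholders you would supply, and an actual client would POST the body and read the returned tables:

```python
import json

LA_QUERY_URL = "https://api.loganalytics.io/v1/workspaces/{workspace}/query"

# The query from the walkthrough above.
QUERY = (
    'SecurityEvent '
    '| where (EventID == 1102 or EventID == 517) '
    'and EventSourceName == "Microsoft-Windows-Eventlog" '
    '| summarize AggregatedValue = count() by Computer'
)

def build_query_request(workspace_id, bearer_token):
    """Build (url, headers, body) for the Log Analytics query endpoint.

    workspace_id and bearer_token are placeholders, not real values.
    """
    url = LA_QUERY_URL.format(workspace=workspace_id)
    headers = {
        "Authorization": "Bearer " + bearer_token,
        "Content-Type": "application/json",
    }
    return url, headers, json.dumps({"query": QUERY})
```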


Now take this query into ASC and open up “Custom Alerts Preview”


Click “New custom alert rule”


Start creating your custom rule as below –


Check your query has pasted correctly by clicking “Execute your search query now”. This should return results similar to those you previously had when building your query in analytics.


Complete the alert criteria, evaluation window etc and click “OK” –


Your custom alert should now be listed –


Now to test, clear some event logs and monitor ASC alerts!


A handy tip when looking at alerts in ASC is the filter option at the top left of the alerts blade. I limited my filter to just low priority to make this alert easier to find.


Hope this is useful!

Load solutions faster with Visual Studio 2017 version 15.6

$
0
0

As we have been working to improve the solution load experience in Visual Studio 2017, you may have read our blog about these improvements in version 15.5. With version 15.6, we have introduced parallel project load, which loads large .NET solutions twice as fast as earlier versions when you reload the same solution. This video compares the time it takes to load a very large solution from the Roslyn repository, with 161 projects, between version 15.5 and 15.6.

solution load performance comparing 15.5 and 15.6

Parallel project load

During the first load of a solution, Visual Studio calculates all the IntelliSense data from scratch. In the previous version of Visual Studio 2017, version 15.5, we optimized the calculation of IntelliSense data by parallelizing the design-time build that produces the data. Solutions that were opened for the first time on a machine loaded significantly faster because of this parallelization. IntelliSense data was cached, so subsequent loads of a solution didn’t require a design-time build.

With version 15.6 we wanted to go one step beyond optimizing IntelliSense calculation. We've enabled parallel project load for large solutions that contain C#, VB, .NET Core, and .NET Standard projects. Many Visual Studio customers have machines with at least 4 CPU cores. We wanted to leverage the power of all the CPUs during solution load by loading projects in parallel.

We continuously monitor solution load telemetry coming in through the Customer Experience Improvement Program. We saw that, in aggregate, customers experienced a 25% improvement in solution load times in version 15.6, across all solution sizes. Large .NET solutions experienced even larger improvements, and now load twice as fast as previous versions. A customer with a very large, 400+ project solution told us that their solution now loads 2-4 times faster!

Solution load is getting leaner

Parallel solution load is part of the work we’re doing to improve solution load. Another big effort is to get unneeded components out of the way of solution load. Historically, many components and extensions used solution load to perform initialization work, which added to the overall solution load time. We are changing this to make Visual Studio developers productive as quickly as possible. Only critical code that enables navigation, editing, building, and debugging will run during solution load. The rest of the components initialize asynchronously afterwards.

For example, Visual Studio previously scanned git repositories during solution load to light up the source control experience. Not anymore. This process now starts after the solution has loaded, so you can start coding faster.

As another example, Visual Studio used to synchronously initialize the out-of-process C# language service. This code is now optimized to reuse data structures already available in the Visual Studio process. We’re also working with extension owners to delay operations that do not impact the solution load process until after the load has completed.

We have also optimized the Visual Studio solution loader to batch and run all critical solution load operations before any other work can run. Previously, the Visual Studio solution loader fired a subset of solution load events asynchronously. This allowed lower priority work to interfere with solution load. Additionally, there are popular extensions that listen to these events and can block Visual Studio for seconds. This experience confused some customers, because it was not clear when solution load was complete. This logic is now optimized, so that all solution load events fire synchronously during solution load.
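One way to picture "all critical operations run before any other work" is a two-level priority queue. The sketch below (illustrative Python with made-up names) drains every critical task before the first background task runs, regardless of posting order:

```python
import heapq

CRITICAL, BACKGROUND = 0, 1

class LoadScheduler:
    def __init__(self):
        self._queue = []
        self._counter = 0  # tie-breaker keeps FIFO order within a priority

    def post(self, priority, task):
        heapq.heappush(self._queue, (priority, self._counter, task))
        self._counter += 1

    def run(self):
        completed = []
        while self._queue:
            _, _, task = heapq.heappop(self._queue)
            completed.append(task())
        return completed

sched = LoadScheduler()
sched.post(BACKGROUND, lambda: "scan git repo")
sched.post(CRITICAL, lambda: "load projects")
order = sched.run()  # critical tasks always drain first
```

Because the priority is the first element of the heap tuple, no lower-priority work can interleave with the critical batch — the property the optimized solution loader enforces.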

While solution load is significantly faster in version 15.6, we are not done yet. You will see us making solution load even leaner in future updates.

Know what slows you down

Even with these improvements, it’s still possible for slow blocking operations to get scheduled after the solution has loaded. When this happens, the solution appears loaded, but Visual Studio is unresponsive while it is processing these operations. Visual Studio version 15.6 detects blocking operations and presents a performance tip. This helps extension authors find these issues and gives end users more control over their IDE’s performance.
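The detection itself boils down to timing each operation and flagging the ones that exceed a threshold. A minimal sketch of that idea (Python for illustration; the threshold value is invented, not the IDE's actual one):

```python
import time

def timed_operation(name, fn, threshold_s=0.05):
    # Run fn, measure wall-clock time, and flag it as blocking when it
    # exceeds the threshold -- the spirit of the IDE's performance tip.
    start = time.perf_counter()
    result = fn()
    elapsed = time.perf_counter() - start
    blocking = elapsed > threshold_s
    if blocking:
        print(f"{name} blocked for {elapsed:.3f}s")
    return result, blocking

value, blocking = timed_operation("quick step", lambda: 42)
```

In the IDE the flagged operation is attributed to the extension or component that scheduled it, which is what powers the performance tip shown below.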

Performance tip in the IDE showing operations that cause delays

Extension authors can use an asynchronous API that allows extensions to run code without blocking users. We’ve also published guidance for extension authors to learn how to diagnose and address responsiveness issues. If you regularly see unresponsiveness notifications for an extension, reach out to the owner of the extension or contact us at vssolutionload@microsoft.com.

Let us know

We would love to know how much faster your solution loads in version 15.6. Give it a try and let us know by sending an email to vssolutionload@microsoft.com. You can also tweet about it and include @VisualStudio.

Viktor Veis, Principal Software Engineering Manager, Visual Studio
@ViktorVeis

Viktor runs the Project and Telemetry team in Visual Studio. He is driving engineering effort to optimize solution load performance. He is passionate about building easy-to-use and fast software development tools and data-driven engineering.
