
Microsoft Dynamics 365 with SQL Server 2016 Cumulative Update 2 (CU2)


Applies To:  Dynamics 365 (on-premises)

 

You can now run Dynamics 365 (on-premises) with SQL Server 2016 CU2 using database compatibility level 110 (the SQL Server 2012 compatibility level). The application's performance improves when you configure the organization database with trace flag 1224, which reduces CPU usage by disabling lock escalation based on the number of locks. For details, please see Improve performance when you use Microsoft Dynamics 365 with SQL Server 2016.
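For reference, here is a sketch of the corresponding T-SQL; the organization database name Contoso_MSCRM is a placeholder, and note that setting the flag via DBCC lasts only until the next restart (the -T1224 startup parameter makes it permanent):

-- Set the organization database to SQL Server 2012 compatibility level.
ALTER DATABASE [Contoso_MSCRM] SET COMPATIBILITY_LEVEL = 110;

-- Enable trace flag 1224 globally (lasts until the next service restart).
DBCC TRACEON (1224, -1);

-- Confirm the flag is active.
DBCC TRACESTATUS (1224);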

SQL Server 2016 also brings high-availability enhancements: the obsolete Database Mirroring technology has been replaced by the more efficient Always On Basic Availability Groups.

SQL Server 2016 comes with a rich set of capabilities that can help you meet your business needs. The Dynamics 365 engineering team continues to evaluate these capabilities to enhance the application's performance in upcoming releases. One capability we are evaluating is native compilation of tables and stored procedures, which allows faster data access and more efficient query execution for better performance. Another capability you should consider is the In-Memory Columnstore, which uses column compression to reduce the storage footprint and improve query performance for data warehousing scenarios.

Please see What’s New in SQL 2016 (Database Engine) to help you maintain and fine-tune your applications.

 


– Paul Liew


10 ways to use OneNote in teaching


We stand firm in our view that OneNote is one of the most powerful tools teachers can use with their students. Fiona Beal, a teacher and writer in South Africa, wrote an interesting article in which she lists 10 effective ways for teachers to use OneNote in the classroom. We encourage you to read the article, watch the videos it contains, and try the things she describes; below is a brief summary.

Like many others, she recommends that teachers use OneNote 2016, by far the best version of OneNote. She also recommends that teachers:

  1. make a to-do list for the day and/or the week
  2. add colors to pages, and even to lines if needed
  3. insert documents, either as attachments or as printouts
  4. adopt digital inking by writing directly on the screen
  5. keep track of articles and web pages
  6. use the OneNote Clipper
  7. learn to use templates
  8. dictate notes by recording audio
  9. record videos and store them in OneNote
  10. set up and manage lesson plans

DevOps for IoT (Part 2) – Installing an app from within an app


In this blog post I go into the details of one specific part of the end-to-end implementation: how to install apps from within an app. In a previous post I described the overall scenario for setting up a full CI/CD chain for an IoT application running on Windows 10 IoT Core, orchestrated by Visual Studio Team Services. If you are already familiar with UWP apps, you might wonder how installation of a new application works on Windows 10 IoT Core. (If you are not using Windows 10 IoT Core as your IoT platform, you may have to come up with a similar solution for your device/platform.)

 


On Windows 10 devices in general, the capabilities of Universal Windows apps are limited by default for security reasons. Part of the unavailable functionality is installing software directly from a downloaded file; that is, it is not possible to simply download an app and then trigger the installation of that downloaded app from within a running application without user interaction. To recap: this is exactly what we want in our IoT scenario. I want to update the application, based on a downloaded file, without any user interaction.

However, Windows 10 IoT Core gives you (the developer) control over the device, and you can access additional capabilities if you explicitly allow them for your application. So basically I have to do two things:

  1. Add additional capabilities in the app manifest
  2. In this specific case, add a special registry key on all devices to allow installation of apps from within an app

While #1 is not a problem at all and clearly just a developer task, #2 might make you frown, because it sounds like something that has to be done manually. If we stick with the refrigerator sample, this would mean adding registry keys in millions of refrigerators. First, it's important to know that this has to be done only once per device. Second, of course, this isn't something you would do manually; instead you could, for example, put it into a custom Windows 10 IoT image you might be providing for your device anyway.

It's important to note that besides the application you are really working on, the IoT application itself (e.g. the refrigerator control), I'm using a second application which handles installation and updates of the refrigerator control. While the Refrigerator App is really "just" a normal app, the second "Installer App" has extended capabilities. This app also handles the communication with IoT Hub.


Modifying the *.appxmanifest

To add the additional capabilities, add the following namespaces to the app manifest of your Installer app:

xmlns:iot="http://schemas.microsoft.com/appx/manifest/iot/windows10" xmlns:rescap="http://schemas.microsoft.com/appx/manifest/foundation/windows10/restrictedcapabilities"

and add iot and rescap to the list of ignorable namespaces. You have to do this in XML; there is no UI for it. Use the context menu to switch between code view and design view.

 


After adding the namespaces, add the following capabilities in the capabilities section of your manifest:

<iot:Capability Name="systemManagement" />
<rescap:Capability Name="packageQuery" />
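
Put together, the relevant parts of the Installer app's manifest look roughly like this (a sketch; Identity and the application definition are abbreviated):

<Package
  xmlns="http://schemas.microsoft.com/appx/manifest/foundation/windows10"
  xmlns:uap="http://schemas.microsoft.com/appx/manifest/uap/windows10"
  xmlns:iot="http://schemas.microsoft.com/appx/manifest/iot/windows10"
  xmlns:rescap="http://schemas.microsoft.com/appx/manifest/foundation/windows10/restrictedcapabilities"
  IgnorableNamespaces="uap iot rescap">

  <!-- Identity, Properties, Dependencies, Applications ... -->

  <Capabilities>
    <Capability Name="internetClient" />
    <iot:Capability Name="systemManagement" />
    <rescap:Capability Name="packageQuery" />
  </Capabilities>
</Package>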

 


 

Modifying the device registry

Now you have to modify the registry on your IoT device. You need the installation folder of your application. To find it, you can either deploy your app to the device and remote into the system (e.g. via PowerShell) to look up the path, or guess it based on the default values.

You add the key by running a command like this in PowerShell, remoted into your device:

REG ADD "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\EmbeddedMode\ProcessLauncher" /v AllowedExecutableFilesList /t REG_MULTI_SZ /d "c:\windows\system32\applyupdate.exe\0c:\windows\system32\deployappx.exe\0c:\installer\appinstall.cmd\0c:\Data\Users\DefaultAccount\AppData\Local\Packages\PACKAGEFAMILYNAME\LocalState\installerAppInstall\appinstall.cmd"

This key contains the executables which are allowed to run. You have to add the correct path to the executable which performs the installation. In my case I'm downloading appinstall.cmd and the app from FTP into the LocalState folder of my installer app, so I had to adjust the path using the Package Family Name of the Installer App (as found in the appxmanifest) and to adjust the path to the *.cmd according to the directory structure of my downloaded files.
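
If you are unsure of the Package Family Name, a quick way to look it up in a PowerShell session remoted into the device (a standard cmdlet, shown here as a sketch) is:

Get-AppxPackage | Select-Object Name, PackageFamilyName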


 

Once you have completed these steps, your installer app is able and allowed to install apps on your target device.

 

Installing an app from within an app

To trigger the installation, you just call the process launcher and pass in the path to the executable which installs the app. In my case, as stated above, this is the appinstall.cmd file.

// var cmd = something like "…\LocalState\installerAppInstall\appinstall.cmd" depending on your structure

var result = await ProcessLauncher.RunToCompletionAsync(cmd, args);

You may wonder where I got that appinstall.cmd file from. It is created by my private build agent, which has the Windows 10 ADK installed.
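
Fleshed out a little, the call could look like the sketch below. The folder layout under LocalState mirrors my download structure and is an assumption for your app; error handling is minimal:

using System;
using System.Threading.Tasks;
using Windows.Storage;
using Windows.System;

public static class Updater
{
    public static async Task RunAppInstallAsync()
    {
        // appinstall.cmd was downloaded into this app's LocalState folder.
        var cmd = ApplicationData.Current.LocalFolder.Path
                  + @"\installerAppInstall\appinstall.cmd";

        // Requires the systemManagement capability and the registry entry above.
        ProcessLauncherResult result =
            await ProcessLauncher.RunToCompletionAsync(cmd, string.Empty);

        // A non-zero exit code indicates the installation script failed.
        if (result.ExitCode != 0)
        {
            System.Diagnostics.Debug.WriteLine(
                $"appinstall.cmd failed with exit code {result.ExitCode}");
        }
    }
}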

 

This post showed you how to set up an installer app which can install and update another app. This is one of the key "tricks" for getting the scenario working if you're using Windows 10 IoT Core. If you're not using Windows 10 IoT Core, you may have to solve the challenge of installing an app without user interaction differently. I will cover other interesting aspects of the full DevOps scenario in a subsequent post.

Subscription usage and quotas in the Azure portal


The Azure team recently announced some new features in the preview portal:

https://azure.microsoft.com/en-us/updates/get-early-access-to-new-portal-features-2/

Of particular interest is the new Subscription Usage + quotas view that displays a graphical view of your subscription’s current limits:


You can filter the list by service, provider, and location, and restrict the view to only the items currently in use or show all items. The page has a handy link to initiate a quota increase request. Additionally, you can sort the Usage column in descending order to see which quotas are nearing their limits, which is extremely useful when planning future usage increases.

The preview portal is located here: preview.portal.azure.com

For more detailed information on quotas, please see the following documentation: Azure subscription and service limits, quotas, and constraints

Performance issues with Visual Studio Team Services – 01/30 – Resolved


Final Update: Monday, 30 January 2017 21:18 UTC

We've confirmed that all systems are back to normal as of 30 January 2017 ~21:20 UTC. We are aware of the root cause and have taken measures to make sure the issue does not recur. Customers should now have no issues viewing, changing, and querying custom fields in inherited processes through all clients.

Sincerely,
Manjunath


Update: Monday, 30 January 2017 19:32 UTC

Investigations have narrowed the root cause down to an issue in the Client OM backward compatibility logic. We have applied a temporary mitigation while we prepare a fix. The mitigation temporarily disables the backward compatibility logic. While it is disabled, customers will not be able to view, change, or query custom fields from inherited processes in Visual Studio, Excel, or any other tools that use the SOAP APIs. Customers will still be able to interact with custom fields in Web Access.


  • Impacted Region : South Brazil
  • Next Update: Before 23:30 UTC


Sincerely,
Manjunath


Update: Monday, 30 January 2017 17:55 UTC

Our DevOps team continues to investigate issues with Visual Studio Team Services in the South Brazil region. We are working on mitigating the issue and understanding the root cause. Customers may continue to observe slowness while accessing their accounts.


  • Impacted Region : South Brazil
  • Next Update: Before 21:00 UTC


Sincerely,
Manjunath


Update: Monday, 30 January 2017 15:32 UTC

We are still investigating issues with Visual Studio Team Services. Some customers in the South Brazil region may experience performance issues while trying to connect to their accounts. Our DevOps team is working on identifying the root cause of the issue.

  • Impacted Region : South Brazil
  • Next Update: Before 18:00 UTC


Sincerely,
Rakesh Reddy


Update: Monday, 30 January 2017 14:35 UTC

We are actively investigating issues with Visual Studio Team Services. Some customers may experience performance issues while trying to connect to their accounts. We are in the process of assessing the impact.

  • Next Update: Before 16:00 UTC


Sincerely,
Vamsi

Troubleshooting Miracast connection to the Surface Hub


Connecting to the Surface Hub via Miracast is a very common scenario. For it to work, the sending device (mobile phone, laptop) connects to the Surface Hub over a Wi-Fi Direct connection. This can fail for a few reasons. Here are some steps you can try to troubleshoot the problem:

  • Make sure your sending device supports Miracast: press WinKey + R and type dxdiag. Click "Save all information", open the saved dxdiag.txt, and look for Miracast. It should say "Available, with HDCP".
  • If the Connect app displays "The device doesn't support Miracast, so you can't project to it wirelessly.", go to Settings -> Network -> Wi-Fi and enable Wi-Fi.
  • Update the Wi-Fi and video drivers on the laptop (you can also try uninstalling and then reinstalling them).
  • Hit WinKey + R and type "rsop.msc" to launch the "Resultant Set of Policy" MMC snap-in. It will take a moment to analyze the group policies currently applied to the client. In the left-hand window, navigate to Computer Configuration -> Windows Settings -> Security Settings -> Wireless Network (IEEE 802.11) Policies. In the right-hand pane you should see a group policy object setting wireless policies. Double-click it and a dialog will appear. Open the Network Permissions tab and check the value of the "Allow everyone to create all user profiles" box.


This value has to be selected; it enables your sending device to create the ad-hoc network needed to send the video stream to the Surface Hub. Also ensure that "Don't allow Wi-Fi Direct groups" is not checked. Wi-Fi Direct provides the functionality that allows direct device-to-device connectivity for Miracast, allowing a display device and other sources to discover each other. This setting can also be changed manually in the registry by deleting the "Windows XP" folder found under Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\Wireless. A reboot is required after making this change.
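
If you take the registry route, a sketch of the equivalent command (assuming the policy key exists on your machine) is:

reg delete "HKLM\SOFTWARE\Policies\Microsoft\Windows\Wireless\Windows XP" /f

Reboot after running it.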

 

  • Change the default band used by Miracast on the Surface Hub: Settings – This device – Wireless Projection – Miracast Channel. Test with the default first; if that does not work, try one of the channels supported in your region (search for "Miracast" on the page). If you are seeing slow refresh rates, you might want to use one of the 5 GHz channels, as the 5 GHz spectrum is generally less crowded.

Important note: once you have changed the channel on the Hub, you need to reboot the Hub before testing again, otherwise the change will not take effect.

  • In some cases, third-party anti-virus software can prevent Miracast connections from being established. On a test machine, uninstall the anti-virus software, reboot, and then try again.
  • An enabled firewall rule may also cause Miracast connection issues. Add a firewall rule that allows the service "C:\Windows\System32\WUDFHost.exe" to make inbound and outbound connections over the TCP and UDP protocols on all ports.
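
For example, a sketch of adding such rules with netsh (the rule names are arbitrary):

netsh advfirewall firewall add rule name="WUDFHost TCP in" dir=in action=allow program="C:\Windows\System32\WUDFHost.exe" protocol=TCP
netsh advfirewall firewall add rule name="WUDFHost UDP in" dir=in action=allow program="C:\Windows\System32\WUDFHost.exe" protocol=UDP
netsh advfirewall firewall add rule name="WUDFHost TCP out" dir=out action=allow program="C:\Windows\System32\WUDFHost.exe" protocol=TCP
netsh advfirewall firewall add rule name="WUDFHost UDP out" dir=out action=allow program="C:\Windows\System32\WUDFHost.exe" protocol=UDP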

January 2017 Update for ASP.NET Core 1.1


We just released an update for ASP.NET Core 1.1 due to Microsoft Security Advisory 4010983. The advisory is for a vulnerability in ASP.NET Core MVC 1.1.0 that could allow denial of service. All of the information you need is in the advisory. A short summary is provided below.

Red Hat customers should consult the Red Hat advisory for the same issue.

How to Obtain the Update

The update is in the Microsoft.AspNetCore.Mvc.Core package. You need to upgrade your project to use version 1.1.1 (or later) of the package and then re-publish your application.

See below for examples of project file updates, for project.json and csproj formats. Note the updated Microsoft.AspNetCore.Mvc.Core package version.

Project.json

The dependencies section of an updated project.json file would look like the following (in its most minimal form).

"dependencies": {
"Microsoft.NETCore.App": {
    "version": "1.1.0",
    "type": "platform"
},
"Microsoft.AspNetCore": "1.1.0",
"Microsoft.AspNetCore.Mvc.Core": "1.1.1",
}

CSProj

An updated csproj file would look like the following (in its most minimal form):

<Project Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <TargetFramework>netcoreapp1.1</TargetFramework>
  </PropertyGroup>
  <PropertyGroup>
    <PackageTargetFallback>$(PackageTargetFallback);portable-net45+win8+wp8+wpa81;</PackageTargetFallback>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore" Version="1.1.0" />
    <PackageReference Include="Microsoft.AspNetCore.Mvc.Core" Version="1.1.1" />
  </ItemGroup>
</Project>
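
After editing the project file, restore packages and re-publish; with the standard .NET Core CLI that is:

dotnet restore
dotnet publish -c Release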

Learn more

You can ask questions on the aspnet/mvc repo, where a discussion issue has been created.

Automated project creation in Visual Studio Team Services


One of the things I like most about Git, for example, is how easily you can add an external origin (from Team Services, Azure, or GitHub, for instance), pull down the code, and start working safely on your solution, all without leaving a simple command console.

However, one of the missing pieces I could not do from the console was creating the repository, or the project itself, in Team Services. I had to pause, go to the portal, log in if I was not already logged in, create the project, look up the Git URL, and only then wire everything up with my development client.

Team Services obviously exposes a REST API for executing operations programmatically, so I decided to investigate how to build a console application that would let me create the repository in Team Services automatically, without entering the portal.

A very simple way to access the Team Services API is through Personal Access Tokens (PAT). If you have a Team Services account, you can obtain one from the portal (you only have to do it once), in the security section of your account.


 

There you will find the management page for the PATs associated with your Team Services account. For better control, I recommend generating a dedicated PAT for your project-creation application. You can choose which permissions that PAT will grant over the Team Services APIs and for how long it will be valid. The idea is to then use the PAT to call the API without needing a username and password, or two-factor authentication.


I wanted to build this project on .NET Core (if you are not familiar with .NET Core, you can watch this short video I made that explains it very quickly), so that I can still use the tool when I am on Linux or perhaps macOS. The project makes extensive use of the .NET Core settings file which, as is now standard, I have named appsettings.json:

{
  "domains": {
    "warnov": "YOUR PAT",
    "sample": "SampleAccountPATlaxs3zhqf74rcflkxqhufbmhnhpmma"
  },
  "templates": {
    "scrum": "6B724908-EF14-45CF-84F8-768B5384DA45",
    "agile": "ADCC42AB-9882-485E-A3ED-7678F01F66BC",
    "CMMI": "27450541-8E31-4150-9947-DC59F998FC01"
  },
  "sourceControl": [ "Git", "Tfvc" ],
  "defaults": {
    "sourceControlProvider": "git",
    "template": "6B724908-EF14-45CF-84F8-768B5384DA45"
  },
  "apiPath": "https://{0}.visualstudio.com/_apis/projects?api-version=2.2"
}

 

Working with this kind of file in .NET Core is somewhat different from what we used to do with files like Web.config or App.config, starting with the fact that they are no longer XML but JSON. For a detailed guide on handling these files, see this post I wrote on the subject.

First there is a domains section, where we specify the PAT for each domain. In the code above this means I can work with Team Services in warnov.visualstudio.com and sample.visualstudio.com, since presumably I have specified valid PATs for each of them.

Then come the templates; as you can see from the keys, they indicate whether the life cycle of our software project will follow Scrum, Agile, or CMMI. I generally choose Scrum.

Next come the options for managing source code. Today Team Services offers Git and Tfvc (Team Foundation Version Control). Lately I lean toward Git because of the simplicity of its approach.

I also created a section of default values for both the source control provider and the template to use, so the application can run without me having to specify every parameter when calling it.

Finally, I specify the format of the API path to use. This API is constantly evolving and I have seen its URL change slightly; or you may want to use another version of the API. In that case, just modify this format string. Note the {0}: it will be replaced with one of the domain names specified above, whichever one you choose to work with.

Now, let's have some fun!

 

if (args.Length > 0)
{
    workingDomain = args[0];
    workingProject = args[1];
    description = args[2];
    var defaults = Config.GetSection("defaults");
    sourceControlProvider = defaults["sourceControlProvider"];
    templateId = defaults["template"];
}

I simply check whether arguments were passed to the tool. The first would be the domain, for example warnov if I want to work with warnov.visualstudio.com; then the project name, for example Picas y Fijas on the Cloud; and finally the project description: "A marvelous and revolutionary interactive game". As you can see, the other values are taken from the defaults section defined in my appsettings; in fact, this gives you an idea of how that file is used.

If I pass no parameters, I ask for them interactively:

else
{
    //Domain
    workingDomain = GetDomain();
    //Project Name
    Console.WriteLine("Project Name: ");
    workingProject = Console.ReadLine();
    //Project Description
    Console.WriteLine("Project Description: ");
    description = Console.ReadLine();

    //SOURCE CONTROL
    sourceControlProvider = GetSourceControlProvider();

    //TEMPLATE ID
    templateId = GetTemplateId();
}

And that is all. We then gather the other values required for the connection, such as the PAT; we build the message to send to the API, send it using POST, and finally display a message with the result at the end of the program's execution.

//PAT
var pat = domainsSection[workingDomain];
//Message Body
var messageBody = GetMessageBody(workingProject, description, sourceControlProvider, templateId);
//PostExecution
var postResult = ExecutePost(workingDomain, pat, messageBody);
Console.WriteLine($"{postResult}\n\nPress Enter to Finish...");
Console.ReadLine();

The message to send:

ProjectCreator body = new ProjectCreator(workingProject, description, sourceControlProvider, templateId);
var jsonBody = JsonConvert.SerializeObject(body);
return new StringContent(jsonBody, Encoding.UTF8, "application/json");

Note that there is a model class called ProjectCreator. I built this class from the API documentation, using the methodology I described in this post. It is a methodology that helps a lot when connecting to any modern REST- and JSON-based API.

Once I initialize this class with the values required to create my project in Team Services, I convert it to JSON and wrap it in a StringContent, the framework-provided object that prepares text data to be included in the body of a POST message sent to the server.
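
For illustration, a minimal version of such a model class might look like the following sketch (the property names mirror the REST API payload; the exact class in my repository may differ):

public class ProjectCreator
{
    public string name { get; set; }
    public string description { get; set; }
    public Capabilities capabilities { get; set; }

    public ProjectCreator(string name, string description,
        string sourceControlProvider, string templateId)
    {
        this.name = name;
        this.description = description;
        capabilities = new Capabilities
        {
            versioncontrol = new VersionControl { sourceControlType = sourceControlProvider },
            processTemplate = new ProcessTemplate { templateTypeId = templateId }
        };
    }
}

public class Capabilities
{
    public VersionControl versioncontrol { get; set; }
    public ProcessTemplate processTemplate { get; set; }
}

public class VersionControl { public string sourceControlType { get; set; } }
public class ProcessTemplate { public string templateTypeId { get; set; } }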

Finally, we send the message:

private static string ExecutePost(string workingDomain, string pat, StringContent messageBody)
{
    string responseBody = string.Empty;
    using (HttpClient client = new HttpClient())
    {
        string _credentials = Convert.ToBase64String(System.Text.ASCIIEncoding.ASCII.GetBytes(string.Format("{0}:{1}", "", pat)));
        client.DefaultRequestHeaders.Accept.Clear();
        client.DefaultRequestHeaders.Accept.Add(new System.Net.Http.Headers.MediaTypeWithQualityHeaderValue("application/json"));
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", _credentials);
        var url = Config["apiPath"];
        var request = new HttpRequestMessage(HttpMethod.Post,
            string.Format(url, workingDomain))
        {
            Content = messageBody
        };
        try
        {
            var response = client.SendAsync(request).Result;
            response.EnsureSuccessStatusCode();
            var code = response.StatusCode;
            if (response.IsSuccessStatusCode)
            {
                responseBody = $"{code}: {response.Content.ReadAsStringAsync().Result}";
            }
        }
        catch (Exception exc)
        {
            responseBody = $"ERROR: {exc.Message}. {exc.InnerException?.Message}";
        }
    }
    return responseBody;
}

First we set up the credentials for accessing the API by Base64-encoding the PAT we obtained, then we set the authorization header with that data. We also indicate that the API handles JSON messages.

Next we build the URL for our request and attach the message we assembled earlier.

Finally we send the request and, via response.EnsureSuccessStatusCode(), ask the framework to throw an exception if the response is not successful.

Note that the initial response will not be a 200, because the operation has not finished by the time we receive the reply. Creating a project takes some time, so what the Team Services server returns is a 202, which tells us everything is fine but that we must wait for the operation to complete (which we can generally do without any special action beyond waiting a few seconds). It also returns a URL for tracking the status of the request, in case we want to extend the tool so that it creates the local repository itself once the project has been created in the cloud; but that would be the subject of another post.

So afterwards all that remains is to initialize our Git repository and add a remote with an address similar to this:
https://[yourdomain].visualstudio.com/DefaultCollection/_git/[yourprojectname]
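
In practice, that boils down to something like this (hypothetical domain and project names):

git init
git remote add origin https://yourdomain.visualstudio.com/DefaultCollection/_git/yourprojectname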

and voilà... we have created everything needed for our local development environment backed by Team Services, without leaving the console!

You can find all the code for this solution on my GitHub. You can add more functionality based on the Team Services API or modify its behavior. I will be watching for your pull requests!



How to ensure your toast notifications continue to work when converting your win32 app into a Windows Store app using project Centennial


[Originally posted by Lei Xu]

 

As you may already know, developers can now bring their existing apps and games to the Windows Store with the Desktop Bridge, also known as project Centennial.

For developers who have been integrating toast notification features since Windows 8 and Windows 10: when you turn your app into a Windows Store app using project Centennial, you may find that your toast notifications no longer work as desired; an incorrect/unresolved string shows up in Action Center where the application title is supposed to be shown. This post explains why this happens and how to fix it with a simple change to your code.

Why does it break?

In Windows 8/8.1, we enabled desktop Win32 apps to send local toast notifications by allowing them to do two things:

  • register with the notification platform, and
  • call WinRT APIs to send toasts.

What did we do to enable those respectively?

  • A registration with the notification platform requires an app identity, but Win32 apps don't really have modern app identities. This problem is solved by allowing Win32 apps to provide a "fake" app identity, created through a desktop shortcut.
  • The app will then need to call the method overload that takes this manually created application id to indicate which identity is trying to send this notification, as shown:
    ToastNotificationManager.CreateToastNotifier(appId).Show(toast);

Windows Store apps converted through project Centennial, on the other hand, already come with proper app identities just like any regular UWP app, so if the Win32 app is converted without changing any of its previous code, a conflict arises:

  • The shortcut creation is skipped for the converted Store app during app deployment, because the deployment path is different; however,
  • The app still calls the same overload mentioned above, passing in the fake app id, which the OS can no longer resolve since the shortcut is not there, and this results in the unresolved string.

How to fix it?

Any app in this situation simply needs to change one line of its code to call

ToastNotificationManager.CreateToastNotifier().Show(toast);

without passing in the manually created appId, and just let the OS resolve the app identity as it does for any other modern app.
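
For reference, a minimal sketch of sending a local toast this way (the payload and text here are illustrative):

using Windows.Data.Xml.Dom;
using Windows.UI.Notifications;

// Build a simple toast payload.
var xml = new XmlDocument();
xml.LoadXml("<toast><visual><binding template='ToastGeneric'>" +
            "<text>Hello from a converted app</text></binding></visual></toast>");

// No appId argument: the OS resolves the converted app's identity on its own.
ToastNotificationManager.CreateToastNotifier().Show(new ToastNotification(xml));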

Why didn’t we fix this for you?

The same method overload can also be used by Windows Store apps to send toast notifications on behalf of other apps in the same package. Once a Win32 app is converted to a Windows Store app through project Centennial, Windows just sees it as a regular Store app, without knowing its true intention behind calling CreateToastNotifier(appId). To keep us from guessing your real intention wrong and breaking other, unexpected scenarios, we ask developers to make this change themselves during the conversion process.

Dynamics Retail Discount Extensibility – Scenarios II


Let’s have more scenarios.

Scenario – mix and match with flexible quantity setup

Out of the box, the Dynamics Retail Discount Engine supports mix and match with a fixed quantity setup; for example, buy one Xbox console and get 2 games 20% off. But you may also want mix and match with a flexible quantity setup; for example, buy a phone and get 20% off all of its accessories, or get $10 off the total of all of its accessories.

Let’s distill it a bit more. Essentially, we have two line groups of products:

  • Qualifying line group. In this example, a phone.
  • Application line group. In this example, the phone’s accessories.

There is a (small) chance that we may add support out of the box in the future.

Scenario – quantity discount with tiered application

Out of the box, the Dynamics Retail Discount Engine supports quantity discounts with unified application; for example, get 20% off if you buy 4+ keyboards. If you buy 4 or more keyboards, you get 20% off all of them.

A quantity discount with tiered application is different: in the same example, if you buy 4 or more keyboards, the first three are not discounted and the additional ones get 20% off, as the sketch below illustrates.
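
A tiny sketch of the arithmetic difference, using hypothetical prices (an illustration only, not engine code):

decimal price = 50m;   // keyboard price
int quantity = 5;      // purchased quantity, meets the 4+ threshold

// Unified application: all 5 keyboards get 20% off.
decimal unifiedTotal = quantity * price * 0.8m;                      // 200.00

// Tiered application: the first 3 at full price, only the rest 20% off.
int threshold = 3;
decimal tieredTotal = threshold * price
                    + (quantity - threshold) * price * 0.8m;         // 230.00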

In summary, for both scenarios, we are talking about new discount types.

Related: Dynamics Retail Discount Extensibility – Scenarios I

Ignite AU 2017 discount and joining in the Monday Meetup Madness


Ignite Australia 2017

Good news: we managed to secure discounted tickets for BizSpark members for another Microsoft Ignite in Australia. It is fast approaching (two weeks away, so get moving!).

Some of you might be asking, what is the Microsoft Ignite conference?
We’re bringing together the best of all your favourite tech events – from small to large, including TechEd – into a single tech-fuelled event. Whether you’re a developer or an IT Pro you can get answers, test-drive technologies, and connect with peers, tech leaders, and Microsoft experts.

When/Where is it?
14-17 February 2017 at the Gold Coast Convention & Exhibition Centre.

How much are tickets?
Tickets are normally $1800.00 for earlybird and $2200.00 for a standard ticket. But if you're a BizSpark member, you can send us an email at BizSpark_AU@microsoft.com with your startup name, and I'll send you a unique discount code that entitles your startup to 2 tickets at a discounted rate of $1600.00 per ticket.

Monday Meetup Madness

If you are attending Ignite, or you are just near the Gold Coast on the evening of Monday 13 February, you should join fellow techies at the Monday Meetup Madness in the convention centre. There are many different meetup groups where you can hear experts talk about SQL, .NET, Office 365/SharePoint, Xamarin for cross-platform development, Azure, and more! This is also a great chance to meet the local experts in these areas. Join the group that interests you!

Faster monkeys


Last time I showed how to simulate monkeys typing on typewriters, using letter frequencies based on input text, like Hamlet’s soliloquy.

The results were remarkably similar to the input text, but the output was relatively slow.

Below is a version with an "Optimum" option that produces output much faster when the "-o" command-line parameter is used.

On my machine, the slow version spits out one character every 1.2 seconds for a pattern length of 8.

With the “-o” option, creating the initial data structure takes a few seconds longer, but the text comes out almost a thousand times faster: 980 characters per second.

How can this be?

The slower version uses a single Dictionary<string,int>. The key is a pattern of a given length, say "To Be or". The value is the number of times that pattern occurs in the original text. To generate text, if the prior pattern is "To Be o", the values of all entries whose keys start with "To Be o" are summed, and then one is chosen at random, weighted by its count.

The faster version is essentially a dictionary of dictionaries: Dictionary<string, Dictionary<string, int>>, which precalculates the sums.
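
For example, assuming the compiled executable is named Monkey.exe (matching the project's namespace; your name may differ), a typical invocation is:

Monkey.exe -f hamlet.txt -l 8 -o

which reads the input text from hamlet.txt (-f), uses a pattern length of 8 (-l), and enables the optimized path (-o).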

 


using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
// Start Visual Studio
// File->New->Project->C#->Windows->Console Application
namespace Monkey
{
  public class Program
  {
    static string txtHamlet =
@"To be, or not to be--that is the question:
Whether 'tis nobler in the mind to suffer
The slings and arrows of outrageous fortune
Or to take arms against a sea of troubles
And by opposing end them. To die, to sleep--
No more--and by a sleep to say we end
The heartache, and the thousand natural shocks
That flesh is heir to. 'Tis a consummation
Devoutly to be wished. To die, to sleep--
To sleep--perchance to dream: ay, there's the rub,
For in that sleep of death what dreams may come
When we have shuffled off this mortal coil,
Must give us pause. There's the respect
That makes calamity of so long life.
For who would bear the whips and scorns of time,
Th' oppressor's wrong, the proud man's contumely
The pangs of despised love, the law's delay,
The insolence of office, and the spurns
That patient merit of th' unworthy takes,
When he himself might his quietus make
With a bare bodkin? Who would fardels bear,
To grunt and sweat under a weary life,
But that the dread of something after death,
The undiscovered country, from whose bourn
No traveller returns, puzzles the will,
And makes us rather bear those ills we have
Than fly to others that we know not of?
Thus conscience does make cowards of us all,
And thus the native hue of resolution
Is sicklied o'er with the pale cast of thought,
And enterprise of great pitch and moment
With this regard their currents turn awry
And lose the name of action. -- Soft you now,
The fair Ophelia! -- Nymph, in thy orisons
Be all my sins remembered.
";
    // Slow way use a single dictionary:
    Dictionary<string, int> _dictData;


    // fast way: use a dictionary of dictionaries
    /// <summary>
    /// An instance of cPriorData for each sequence of letters with length = PatternLength -1
    /// </summary>
    public class cPriorData
    {
      // the # of occurrences of the prior sequence.
      public int cnt;
      // for each single char "a", "b", how many follow the sequence
      // e.g. for text "to be or not to be", and prior = "no",
      // this dict will have one entry 't' == 1
      public Dictionary<char, int> _dictChars = new Dictionary<char, int>();
      public override string ToString()
      {
        return $"{cnt}";
      }
    }

    Dictionary<string, cPriorData> _dictCPriorData = new Dictionary<string, cPriorData>();



    Random _rand = new Random(1);
    int _maxLen = -1;
    int _lenPattern = 2;
    bool _optimum = false;
    string _txt = txtHamlet;

    static void Main(string[] args)
    {
      // create a new instance of "Program", minimizing
      // the use of statics and making it 
      // much more maintainable and testable
      var prog = new Program();
      prog.DoMain(args);
    }

    internal void DoMain(string[] args)
    {
      ProcessArgs(args);
      var sw = new Stopwatch();
      sw.Start();
      if (!_optimum)
      {
        FillDictionaryFromReadingText();
      }
      else
      {
        FillDictionaryFromReadingTextOptimum();
      }
      sw.Stop();
      Console.WriteLine($"Filled dictionary in {sw.Elapsed.TotalSeconds:n2} seconds");

      Console.WriteLine($"Txt len = {_txt.Length:n0} PatLen = {_lenPattern} DictSize = {_dictCPriorData.Count:n0}  Optimum = {_optimum}");
      sw.Restart();
      int nChars = 0;
      if (!_optimum)
      {
        nChars = GenerateRandomText();
      }
      else
      {
        nChars = GenerateRandomTextOptimum();
      }

      sw.Stop();
      if (nChars > 0)
      {
        Console.WriteLine($"Generated {nChars} characters at the rate of {(nChars / sw.Elapsed.TotalSeconds):n2} chars per second");
      }
    }

    private void ProcessArgs(string[] args)
    {
      for (int i = 0; i < args.Length; i++)
      {
        switch (args[i][0])
        {
          case '/':
          case '-':
            if (args[i].Length > 1)
            {
              switch (args[i][1])
              {
                case 'f':
                  if (++i < args.Length)
                  {
                    _txt = System.IO.File.ReadAllText(args[i]);
                    Console.WriteLine($"Filename {args[i]}");
                  }
                  break;
                case 'm':
                  if (++i < args.Length)
                  {
                    if (int.TryParse(args[i], out _maxLen))
                    {
                      Console.WriteLine($"Max len = {_maxLen}");
                    }
                  }
                  break;
                case 'l':
                  if (++i < args.Length)
                  {
                    if (int.TryParse(args[i], out _lenPattern))
                    {
                      Console.WriteLine($"Pat len = {_lenPattern}");
                    }
                  }
                  break;
                case 'o':
                  _optimum = true;
                  break;
              }
            }
            break;
        }
      }
      if (_maxLen > 0 && _txt.Length > _maxLen)
      {
        _txt = _txt.Substring(0, _maxLen - 1);
      }
      //_txt = "the quick brown fox jumps over the lazy dog";
      //_lenPattern = 2;
    }

    private void FillDictionaryFromReadingText()
    {
      _dictData = new Dictionary<string, int>();
      var txtLen = _txt.Length;
      var strPrior = " ";
      for (int i = 0; i < txtLen; i++)
      {
        var chrNew = char.ToLower(_txt[i]);
        strPrior += chrNew;
        if (strPrior.Length < _lenPattern)
        {
          continue;

        }
        if (_dictData.ContainsKey(strPrior))
        {
          _dictData[strPrior] += 1;
        }
        else
        {
          _dictData[strPrior] = 1;
        }
        // lop off first char
        strPrior = strPrior.Substring(1);
      }
    }

    private int GenerateRandomText()
    {
      var strPrior = string.Empty;
      int nChars;
      for (nChars = 0; nChars < 1000000 && !Console.KeyAvailable; nChars++)
      {
        if (string.IsNullOrEmpty(strPrior))
        {
          // we need some sort of initial prior string
          // so we'll take it from a random entry in the data
          int numSkip = _rand.Next(Math.Min(10000, _dictData.Count));
          if (_dictData.Count < 30)
          {
            // when testing with very few items
            numSkip = 0;
          }
          strPrior = _dictData
              .Skip(numSkip)
              // look for one that is preceded by " "
              .SkipWhile(
                  d => !d.Key.StartsWith(" "))
              .First()
              .Key
              .Substring(_lenPattern - 1);
          // we need to print the new prior string. 
          // indicates it's a restart by using Upper case
          Console.Write(strPrior.ToUpper());

        }
        // first we find the sum of all the items that start with strPrior
        int sum = _dictData
            .Where(
            d => d.Key.StartsWith(strPrior))
            .Sum(d => d.Value);
        if (sum == 0) // last sequence in text
        {
          strPrior = " ";
          continue;
        }
        // now we get a random number from 0 to the sum
        int targetRand = _rand.Next(sum + 1);
        var target = _dictData
            .Where(d => d.Key.StartsWith(strPrior))
            // we subtract each entry's value until we reach 0
            .Where(d => (targetRand -= d.Value) <= 0)
            .FirstOrDefault();
        var newchar = target.Key.Substring(_lenPattern - 1);
        strPrior = target.Key.Substring(1, _lenPattern - 1);
        Console.Write(newchar);
      }
      Console.WriteLine();
      return nChars;
    }


    private void FillDictionaryFromReadingTextOptimum()
    {
      var strPrior = " ";
      for (int i = 0; i < _txt.Length; i++)
      {
        // read a character from the text
        var chrNew = _txt[i];
        if (strPrior.Length < _lenPattern)
        {
          strPrior += chrNew;
          continue;
        }
        // we now have prior with length = patlen -1 and a new character
        cPriorData pData = null;
        // have we encountered this strPrior before?
        if (!_dictCPriorData.TryGetValue(strPrior, out pData))
        {
          pData = new cPriorData();
          _dictCPriorData[strPrior] = pData;
        }
        // increment the # of times we've seen it
        pData.cnt += 1;
        // for this strPrior, have we seen the chrNew before?
        if (pData._dictChars.ContainsKey(chrNew))
        {
          pData._dictChars[chrNew] += 1;
        }
        else
        {
          pData._dictChars[chrNew] = 1;
        }
        if (strPrior.Length > 0)
        {
          // remove 1st char
          strPrior = strPrior.Substring(1);
        }
        strPrior += chrNew;
      }
    }

    private int GenerateRandomTextOptimum()
    {
      var strPrior = string.Empty;
      int nChars;
      for (nChars = 0; nChars < 1000000 && !Console.KeyAvailable; nChars++)
      {
        if (string.IsNullOrEmpty(strPrior))
        {
          // we need some sort of initial prior string
          // so we'll take it from a random entry in the data
          int numSkip = _rand.Next(Math.Min(10000, _dictCPriorData.Count));
          if (_dictCPriorData.Count < 30)
          {
            // when testing with very few items
            numSkip = 0;
          }
          strPrior = _dictCPriorData
              .Skip(numSkip)
              // look for one that is preceded by " "
              .SkipWhile(
                  d => !d.Key.StartsWith(" "))
              .First()
              .Key;
          // we need to print the new prior string	
          // indicates it's a restart by using Upper case
          Console.Write(strPrior.ToUpper());
        }
        cPriorData pData = null;
        if (!_dictCPriorData.TryGetValue(strPrior, out pData))
        {
          // for some reason, the sequence we have isn't found. 
          // could be that the last sequence in the input text is unique
          // and because it's the end of the text, it's not in the data
          // reset to new random start
          strPrior = string.Empty;
          continue;
        }
        int targetRand = _rand.Next(pData.cnt);
        var target = pData._dictChars
            // we subtract each entry's value until we reach 0
            .Where(d => (targetRand -= d.Value) < 0)
            .FirstOrDefault();
        // remove 1st char from the strPrior and add the target char at the end
        strPrior = strPrior.Substring(1) + target.Key;
        Console.Write(target.Key);
      }
      Console.WriteLine();
      return nChars;
    }

  }
}


About DAC and PRI resources


Previously I explained how to use scaling-aware assets in an app converted with the Desktop App Converter. Around last week (the week of 1/23), the DAC was updated to version 1.0.6. The documentation only mentions improved icon extraction and cleanup plus bug fixes, but in practice the biggest change is how the generated resources, such as icons, are handled. Specifically, the following changes were made:

  • The number of icon types has increased significantly:
    AppLargeTile.scale-xxx.png, AppList.scale-xxx.png,
    AppMedTile.scale-xxx.png, AppSmallTile.scale-xxx.png,
    AppWideTile.scale-xxx.png,
    AppStoreLogo.scale-xxx.png, AppList.targetsize-xxx.png.
    There are 5 scale variants (100, 125, 150, 200, and 400), and
    targetsize comes in 16, 24, 32, 48, and 256, each in a normal and an altform-unplated form, for 10 variants. If the entry-point executable (EXE) contains icon resources, these assets are generated automatically from them.
  • Impact of scaling support on AppxManifest.xml:
    The values 100, 125, 150, 200, and 400 are declared via uap:Scale in the Resources element.
    Asset names are now scaling-aware (the file names referenced in the manifest no longer include scale-xxx or targetsize-xxx).
    The uap:DefaultTile element was added.
    The uap:ShowNameOnTiles element was added.
  • When the -MakeAppx option is specified, PRI files are generated automatically:
    resources.pri, resources.scale-100.pri,
    resources.scale-125.pri, resources.scale-150.pri, and
    resources.scale-200.pri are added to the package root.
    Even if you do not specify -MakeAppx, the assets are scaling-aware, so you must create the PRI files yourself when building the Appx.

If you do not specify the -MakeAppx option

If you do not specify the -MakeAppx option, there are two possible approaches:

  • Do not use scaling-aware assets.
  • Create the PRI resources yourself.

If you do not use scaling-aware assets, rewrite AppxManifest.xml to match the assets actually used; in other words, point it at files that exist, which restores the previous behavior.

Creating the PRI resources yourself requires several steps. Roughly:

  1. Create a PRI configuration file.
  2. Create the PRI files with the MakePri.exe utility.
  3. Place the generated PRI resources in the package layout.
  4. Create the Appx with the MakeAppx.exe utility, specifying the /l (lowercase L) option.
    Without the /l option, you will get an error saying that a file name specified in AppxManifest.xml does not exist.

The easiest way to create the PRI files is to reproduce what the -MakeAppx option does. To do so, prepare the two files PriConfig.xml and layout.resfiles using the following steps:

  1. Launch [Start menu] > [Desktop App Converter].
  2. Launch Task Manager.
  3. In Task Manager, open the context menu of "DACTileLauncher.exe" and click [Open file location].
  4. Open the "icon_extract" folder and copy PriConfig.xml and layout.resfiles to a working folder.
    Keep the copied PriConfig.xml and layout.resfiles somewhere as master copies, because you will use them every time you convert with the DAC.
    The location of PriConfig.xml and layout.resfiles is recorded in the DesktopAppConverter.ps1 script.

Assume the working folder has the following structure:

  • C:\DAC  the root working folder
  • C:\DAC\VLC  the folder named after the package specified to the DAC, containing these subfolders:
    PackageFiles  the Appx layout folder output by the DAC
    logs  the log folder output by the DAC

Copy PriConfig.xml and layout.resfiles into the C:\DAC\VLC folder, and change the layout.resfiles reference inside PriConfig.xml to "..\layout.resfiles". The reason is that the "\" specified in the root attribute refers to the root of the Appx layout, so the reference must point to the path where layout.resfiles was placed.

Once this is done, open a command prompt, make C:\DAC\VLC the current folder, and run the following MakePri.exe command:

MakePri.exe new /pr .\PackageFiles /cf .\PriConfig.xml /o

このコマンドで「C:DACVLC」フォルダに、resources.pri、resources.scale-100.pri、resources.scale-125.pri、resources.scale-150.pri、resources.scale-200.pri の 5 つの PRI ファイルが作成されます。この 5 つの PRI ファイルを PackageFiles フォルダへコピーします。

Once the PRI files are in place, create the Appx with the MakeAppx.exe utility:

MakeAppx.exe /d .\PackageFiles /p VLC.appx /l

Points to keep in mind when using the MakePri.exe utility:

  • When you change the file name of an icon asset, you must regenerate resources.pri.
  • Before regenerating resources.pri, delete the PRI files you copied into the Appx layout folder.
    The default behavior of the MakePri.exe utility is to merge PRI resources, and the merge raises duplicate errors for resources that have not changed.

Conversely, if you only replace the files without changing the icon asset file names, there is no need to regenerate resources.pri.

If you specify the -MakeAppx option

The easiest approach is to keep the file names in the PackageFiles\Assets folder unchanged and simply swap in your production icon assets. In that case, all you have to do is create the Appx with the MakeAppx.exe utility and the /l (lowercase L) option. Of course, if you change file names, you must regenerate resources.pri with the MakePri.exe utility, as described above for the case without -MakeAppx.

주간닷넷 (The week in .NET) – January 10, 2017


We look forward to your active participation. If you have found (or written) an article, source code, or a library that is too good to keep to yourself, let us know via Gist or the 주간닷넷 page. If you share news from .NET user groups, we will pass it along to everyone through 주간닷넷.

On .NET news

Last week on On .NET, we talked with Reed Copsey, Jr., executive director of the F# Software Foundation, about the foundation's mentorship and speaker programs.

This week on On .NET, we will talk with David Pine about how he built his magic mirror, a screen mirror running on Windows 10 IoT Core.

Package of the week: Ammy

XAML is an XML-based declarative language created by Microsoft to describe the objects of a component. It lets developers and designers share and edit content easily without compiling, but editing content directly without tooling can be very inefficient. The ideas XAML implements, however, can be realized perfectly well in other formats.
Ammy was born under the influence of JSON and Qt's QML; it is lightweight, fast, and easy to extend.

Tool of the week: Concurrency Visualizer

Concurrency Visualizer is a valuable Visual Studio extension that helps visualize the performance of multithreaded applications. Its main features include views of the running process and the activity of the associated cores, thread states, anti-pattern detection, and suggestions for improving performance.
Sergey Teplyakov published Understanding different GC modes with Concurrency Visualizer, a post that helps in understanding the tool.

Game of the week: Eco

Eco is a global survival game with a focus on ecosystems and collaboration. Players form teams to build a civilization and must cooperate to advance it quickly enough to destroy a meteor approaching the Earth, while also taking care not to upset the balance of the ecosystem.
Eco is built on a simulated planetary ecosystem in which every action a player takes can affect countless living things, humans included. If the ecosystem falls out of balance, food or resources may be destroyed, and every living thing can die as a result. Players must also found nations and governments, maintain order through law, and keep developing their civilization so that the economy thrives.

Eco was developed by Strange Loop Games in C# with Unity; the website and servers are built on ASP.NET and the .NET Framework. It is currently in alpha and available on Windows, Mac, and Linux. Eco is also used in several schools for educational purposes.

.NET news

ASP.NET news

F# news

What's new in F#

Check out richer F# content in The week in F#, published by the F# community.

Azure news

UWP news

Games news

주간닷넷 is a weekly translation of The week in .NET, published on the .NET Blog; the Korean translation is produced with the help of Kisu Song, Executive Director at OpenSG.

Kisu Song, Executive Director of Technology, OpenSG
He is currently technical director at the development consulting company OpenSG and works on projects across many industries. Before that, he taught .NET developer courses as an instructor at the Samsung Multicampus training center and elsewhere, and since 2005 he has spoken at developer conferences such as TechED Korea, DevDays, and MSDN Seminar. These days he spends most of his working hours in Visual Studio, and he is a 'Happy Developer' who believes he can stay happy by writing about one book a year and giving a couple of lectures a month.

Installing Windows 10 on a PCIe M.2 SSD drive in the Surface Studio


This post is a follow-up to the long, hardware-related blog post I wrote on how to install a PCIe M.2 SSD drive and a SATA SSD drive in a Surface Studio:

https://blogs.msdn.microsoft.com/cesardelatorre/2017/01/29/upgrading-surface-studio-drives-to-sata-ssd-2tb-and-m-2-ssd-1tb/

In that blog post I initially installed Windows 10 on the SATA SSD drive, which is the only drive Windows 10 can see by default, before the RST_AHCI drivers for the PCIe M.2 SSD are loaded.

However, after chatting with Mike (The Office Maven), it is clear that it is much better to install Windows 10 on the PCIe M.2 NVMe SSD drive, because it is much faster than the SATA 2.5" SSD drive.

This is possible while installing a plain Windows 10 (from a bootable Windows 10 USB flash drive): right before selecting the disk/partition you want to install Windows onto, you can load the specific drivers for the M.2 drive (Intel Chipset RAID Controller).

So, this is the procedure.

A. Download the Surface Studio drivers and locate the RST_AHCI drivers for the PCIe M.2 SSD

A.1. Download the drivers from here:

https://www.microsoft.com/en-us/download/details.aspx?id=54311

Notice that you'll get a single .MSI setup file named SurfaceStudio_Win10_1701006_0.msi, but you need just the plain driver files for the M.2 SSD drive.

So, you need to extract those files from the .MSI file.

A.2. Extract the files from the SurfaceStudio_Win10_1701006_0.msi file

To extract the individual driver files from the MSI file, you can use the following msiexec command (where C:\SurfaceStudioDrivers is the destination folder for the extracted files):

msiexec /a SurfaceStudio_Win10_1701006_0.msi targetdir=C:\SurfaceStudioDrivers /qn

Note: When extracting driver files from the MSI, the destination folder (targetdir) must be different than the folder containing the MSI file.

Once the files have been successfully extracted, you can find driver and firmware files under the folder SurfacePlatformInstaller, found in the destination folder. For example, using this command, you would find the Surface Studio driver files in the following folder:

C:\SurfaceStudioDrivers\SurfacePlatformInstaller\SurfaceStudio_Win10_1701006_0

You can copy this whole folder to your USB Flash drive from where you will install Windows 10 (any regular bootable USB Flash drive created from a Windows 10 .ISO image).

However, the folder that you'll need later on is the one named RST_AHCI, which contains the drivers for the "Intel Chipset SATA RAID Controller":

C:\SurfaceStudioDrivers\SurfacePlatformInstaller\SurfaceStudio_Win10_1701006_0\Drivers\System\RST_AHCI


 

A.3 As for creating the "generic" Windows 10 x64 UEFI bootable USB drive, you can create it from a Windows .ISO image obtained from your legal source, like MSDN Subscriptions or any other, with a tool such as Rufus.

Now that you have the drivers copied into your “generic” bootable Windows 10 setup USB Flash drive, let’s move on to the next step.

B. Install Windows 10 from your USB bootable flash drive

B.1 Plug the UEFI bootable Windows 10 setup USB drive into any of the USB slots of your Surface Studio

B.2. Hold down the volume-down key and, while holding it, press and release the power button of the Surface Studio. The machine should then boot from the USB drive.


B.3 Select the “Custom Installation”


B.4 IMPORTANT – Do not install on the drive presented by default, as that one is the SATA drive (a 2 TB SSD in my case)


 

C. Load the RST_AHCI "Intel Chipset SATA RAID Controller" drivers while installing Windows 10

C.1 You want to load the RST_AHCI "Intel Chipset SATA RAID Controller" drivers, so press the "Load Driver" link and browse to the RST_AHCI folder on your USB drive.


C.2 Once you select the RST_AHCI folder and press the OK button, you'll see the controller description.


Select the Intel Chipset SATA RAID Controller and press the NEXT button.

C.3 Now you can select the PCIe M.2 NVMe SSD drive! In my case, the 1 TB SAMSUNG 960 PRO:


C.4 Continue/finish the Windows 10 installation:


D. Install the Surface Studio drivers downloaded from Microsoft’s site

D.1. Finally, install all the Surface Studio drivers, as a plain Windows 10 installation might be missing some of them.

You probably already downloaded them from here: https://www.microsoft.com/en-us/download/details.aspx?id=54311

But now, just run the whole setup.


Finish the driver installation and check that everything is good to go!


 

You also have the option of re-installing the Surface Studio with the Surface Studio Windows 10 image and recovery drive that I explained in my other blog post, because the machine is now aware of the M.2 drive. However, it is basically the same thing, a clean Windows 10 with the Surface Studio drivers, so that extra step is up to you.


Microsoft Google dictionary

Microsoft speak | Google speak | Explanation
By design | Working as intended (WAI) | Bug status when the developer considers the behavior described in the bug report correct
Review | Perf | Performance review – periodic employee performance evaluation
Autopilot | Borg | Datacenter management software
Live site | Prod (from 'production') | Customer-visible service or web site, usually in the context of support, monitoring, emergency response, capacity planning, etc.
Ops (Operations) | SRE (Site Reliability Engineer) | People supporting the live site / production service

Creating an Azure Windows Virtual Machine from an existing image (Azure Resource Manager)


The need to create a virtual machine in Azure from an existing image can arise in various scenarios, such as deploying a specifically configured VM for a new customer, restoring a server after an outage, creating a test/development environment, migrating existing infrastructure, and so on. In this article we will show how to capture an image of a virtual machine running in Microsoft Azure and how to create a virtual machine from such an image, all for version 2 VMs, i.e. virtual machines created through Azure Resource Manager (ARM).

Azure Resource Manager

Before describing the procedure, I would like to say a few words about the benefits ARM brings. Simply put, ARM is a deployment model for Azure services, or resources, that provides a container for the life cycle of applications and solutions built on the Azure platform. This container is called a Resource Group. One advantage of grouping resources this way is simpler management. The recommendation is to create one resource group for all the resources that make up a solution or a logical part of it. An example is the deployment of a web application consisting of an application server, a database server, and a cache database, where all of these resources are deployed into a single resource group. Thanks to such groups it is easy to track the running costs of solutions as wholes, rather than only of individual services; for example, the resource group created when provisioning a virtual machine holds all of the VM's associated resources.

1

 

Besides the ability to group resources, ARM introduces templates that declare the composition of individual resource groups or resources, which is arguably its greatest added value. These templates simplify redeployment, which may be needed, for example, when creating multiple development environments (testing, staging, sandbox, production, ...) or when deploying a single-tenant solution for multiple customers. Each template contains:

· Parameters, which let us change the properties of individual resources and whose values can be set dynamically before the deployment itself. Parameters can also be defined in an external parameters.json file (for a VM, a parameter can be, for example, the name, the number of cores, the amount of memory, etc.)

· Variables, which allow you to dynamically compose values used within the template. Variables can be built from functions, parameters, and constants.

· Definitions of the resources (services) contained in the resource group (VM, storage, web app, database, etc.)

· Outputs, which define the values returned after a deployment using the template completes

We can also define dependencies between resources, which determine the deployment order (e.g. for a VM, the storage account that will host the OS disk must be created first, and only then the server itself). If no dependencies are defined between resources, they are deployed simultaneously. You can access the JSON templates of resources created in the ARM model via the Automation Script link after opening a resource group, or with the Azure Resource Explorer application. More information on ARM and templates can be found in this blog. A minimal template skeleton is sketched below.
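To make the structure concrete, here is a minimal, illustrative template skeleton showing all four sections; the storage account resource and the parameter name are placeholders, and a resource that must wait for another would list it in its dependsOn array:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageAccountName": { "type": "string" }
  },
  "variables": {
    "location": "westeurope"
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "name": "[parameters('storageAccountName')]",
      "apiVersion": "2016-01-01",
      "location": "[variables('location')]",
      "sku": { "name": "Standard_LRS" },
      "kind": "Storage",
      "properties": {},
      "dependsOn": []
    }
  ],
  "outputs": {
    "storageAccountId": {
      "type": "string",
      "value": "[resourceId('Microsoft.Storage/storageAccounts', parameters('storageAccountName'))]"
    }
  }
}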

Creating an Azure Windows VM from an existing image

And now let us move on to the main topic of this article: how to create an Azure Windows Virtual Machine (VM) from an existing image using Azure Resource Manager (ARM). There are many scenarios where this process is worth applying, but for most of them the common denominator is reusability: more specifically, reusing what we have laboriously configured and installed on the server. A typical example is the runtime environment of our application that we want to have available in multiple replicas. Such an environment may contain, for instance, an installed database server, web server, runtime framework, and the special libraries and utilities our application calls. In such a case it pays off to create a base disk image with all of these prerequisites for running our application preconfigured. ARM then lets us use such an image and largely automate the deployment of the virtual machine. Besides using a disk image when creating the VM, ARM's parameters make it easy to change the properties of the virtual machine being created and thus tailor the VM to the needs of the environment (a non-production environment usually needs fewer compute resources than production, etc.). By these properties we mean, for example, the number of cores, the amount of memory, the public IP address, the storage type (Standard or SSD), and so on.

Note: The process of capturing a disk image and then deploying a VM from it is described for virtual machines created through ARM, i.e. the version 2 deployment model. However, if you created a disk image of a virtual machine deployed through Azure Service Management (ASM, the version 1/classic deployment model that the original Azure portal was built on), you can use that image and skip directly to step 2 of the procedure below.

To replicate or redeploy an existing virtual machine, we need to perform the following sequence of steps:

1. Preparing the VM and capturing the image

2. Copying the image to the storage account that will host the VM's operating system disk (this step is needed only if the original and the new virtual machine will not share a storage account for the OS disks)

3. Creating the ARM JSON template and adjusting the deployment parameters

4. Deploying the VM using the modified template

1. Preparing the VM and capturing the image

Before creating an image of the OS disk, the disk must be prepared by generalizing certain properties, such as the server name, the security identifier (SID), the driver cache, and so on. To do this, connect to the virtual machine and run the Sysprep tool.

Warning: After running Sysprep the virtual machine can no longer be used. If you still need this virtual machine, we recommend using the Azure Backup service, which lets you back the server up before running Sysprep and later restore it to its original state.

Sysprep can be run in two ways: either launch the Sysprep utility from PowerShell or open its graphical interface. Both options are shown in the images below. As the System Cleanup Action, choose Enter System Out-of-Box Experience (OOBE). Also check the Generalize box and choose Shutdown as the Shutdown Option. The utility is located in the system folder, specifically system32\sysprep.

2

PowerShell:

& "$Env:SystemRoot\system32\sysprep\sysprep.exe" /generalize /oobe /shutdown

3

 

After the utility starts, a dialog appears informing you that Sysprep is running. After a while your remote session will be interrupted and the virtual machine will begin shutting down. If we look at the virtual machine in the Azure portal, we will see a notification that the server is in an Unknown state; after a moment this changes to a notification that the server has been Stopped, but not deallocated. This means that the shutdown was performed only in software on the server side, not on the Azure controller side. Besides the fact that we are still being charged for the server, it means that Azure still keeps the disk allocated to that virtual machine. The disk therefore remains "locked" and cannot be accessed. In the next step we need to deallocate the disk so that its image can be captured.

4

 

Deallocation and capturing the image

Deallocation and image capture can again be done in two ways: either with the Azure Resource Explorer tool, which you can access at https://resources.azure.com/, or with PowerShell.

Deallocation and image capture using Azure Resource Explorer:

First, sign in with the credentials you use for the Azure portal, and then perform the following steps:

resource-explorer

1. In the drop-down list on the left side of the screen, first select the subscription and the resource group containing the virtual machine whose image you want to capture. Then expand the Microsoft.Compute section, then Virtual Machines, and click the row with the name of your server. (You can speed this up via the search box at the top, where typing the name of your virtual machine is enough.)

2. At the top of the page, select the Read/Write option so that write operations can be performed as well.

3. Then click Actions (POST, DELETE).

4. From the list of actions, first click the deallocate button and switch back to your virtual machine in the Azure portal. Its state should change first to Deallocating and then to Stopped (Deallocated).

5

 

5. As soon as the state changes to Stopped (Deallocated), we can return to Resource Explorer and continue by pressing the generalize button. This step tells the Azure controller that the server has been generalized, i.e. that Sysprep has been run on it.

6. The last step is to capture the OS disk image. This is done by pressing the capture button. Before performing the capture, you must set the parameters in the text field below the capture button: the prefix for the name of the image (the VHD) that will be created, the name of the destination container to be created in the storage account where the disk resides, and a true or false value for the overwriteVhds parameter, which determines whether to overwrite the VHD if you have already captured this disk before. The container name and the prefix must be entered in lowercase. An illustrative request body is shown after the warning below.

Warning: Currently, due to a bug, the VHD is always created inside the System container, under the folder path Microsoft.Compute/Images/<your container name>.
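For illustration, the capture action's parameter text field could contain a body like the following; the prefix and container name are the sample values used later in this article:

{
  "vhdPrefix": "templateprefix",
  "destinationContainerName": "mytemplates",
  "overwriteVhds": false
}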

6

 

Deallocation and image capture using PowerShell (version 1.0.x):

With the PowerShell script you perform the same actions as with Azure Resource Explorer. The sequence of commands to run is as follows:

#Sign in to Azure and select the subscription
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionId {SubscriptionId}

#Deallocate the virtual machine
Stop-AzureRmVM -ResourceGroupName 'VMResourceGroupName' -Name 'VMName'

#Mark the virtual machine as generalized
Set-AzureRmVM -ResourceGroupName 'VMResourceGroupName' -Name 'VMName' -Generalized

#Capture the disk image into the storage account
Save-AzureRmVMImage -ResourceGroupName 'VMResourceGroupName' -VMName 'VMName' -DestinationContainerName 'mytemplates' -VHDNamePrefix 'templateprefix'

2. Copying the disk image between storage accounts

Copying is necessary if the new virtual machine you create from the captured image will have its OS disk in a different storage account. Azure does not support a scenario where the disk image a server is created from and the new virtual machine's own disk reside in different storage accounts. For this copy, use the AzCopy utility, which you can download from this page: https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/. In the command prompt, change to the directory where you installed AzCopy (typically C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy) and run the following command:

AzCopy /SourceType:blob /Source:https://{source storage account name}.blob.core.windows.net/system/Microsoft.Compute/Images/{my container name} /Dest:https://{destination storage account name}.blob.core.windows.net/{destination container name} /SourceKey:{source storage account access key} /DestKey:{destination storage account access key} /Pattern:{captured vhd file name}.vhd

Save the address of the destination location of the VHD file. You will need it when creating the new VM. A prerequisite for the copy is that the destination storage account already exists.

You can find the storage account access keys in the Azure portal, within the storage account, under Settings -> Access keys.

3. Creating the ARM JSON template

With the disk image we will use for the new virtual machine prepared, we need to obtain or create an ARM template with which we will deploy new replicas of the original server. There are two options. The first is to base the new server on a template obtained from the existing server; the second is to build our own ARM template, optionally starting from one of the templates available in the Azure QuickStart Templates list. In this text we will focus on using and modifying the template of an existing virtual machine. The easiest way to get this template is through the Azure portal. In the portal, click Resource Groups in the left menu, then pick from the list the resource group that contains the virtual machine we are about to redeploy. Click the Automation script link and then Download.

Note: the template can also be downloaded using PowerShell, the Azure CLI, or Azure Resource Explorer.

7

 

Before we dive into this process, one more small warning. The template we downloaded contains the definitions of all the resources in the resource group. This means that if you created other services in the resource group, you need to remove them from the template.

If the original server was created with the classic deployment model, no ARM template exists for it, so it cannot be used. In that case you can create the template by editing a Quick Start template, which you can download from the page mentioned above, or you can create a new virtual machine through the ARM model, access its template, and continue with the procedure below.

Since we plan to use the template to deploy new servers, we need to adjust and generalize it appropriately so that it is reusable.

Azure did the basic parameterization for us, and the JSON file we downloaded should contain a list of basic parameters with default values (all but the admin password). The file contains no variables and defines 6 resources of the following types: Microsoft.Compute/virtualMachines, Microsoft.Network/networkInterfaces, Microsoft.Network/networkSecurityGroups, Microsoft.Network/publicIPAddresses, Microsoft.Network/virtualNetworks, and Microsoft.Storage/storageAccounts (2x if you enabled diagnostics for your VM). For each of these resources, its properties and settings are defined. To use the template for deploying a new VM, we need to change all the hard-coded parameter values so that they suit our deployment. It is also advisable to rename the parameters so that they carry generic names rather than names tied to the server or resource group they came from. Optionally, you can create new parameters to specify, for example, the size of the new VM, its geographic location, and so on. More details can be found in the article referenced above. Also, if you do not choose a default value for some parameter, you will be asked for its value during the VM creation process. In this way the deployment result can be influenced outside the template, at deployment time. Parameter values can also be changed or supplied through a parameters.json file, as in the illustrative sketch below.
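As a sketch, a parameters.json for the template shown later in this article could look like this; the values are made up and the keys must match the parameter names your template declares:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "virtualMachine_name": { "value": "myreplicavm" },
    "storageAccount_name": { "value": "myreplicastorage" },
    "publicIPAddresses_ip_name": { "value": "myreplicavm-ip" }
  }
}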

For our new virtual machine to be created from the prepared VHD file, the storage profile in the Microsoft.Compute/virtualMachines resource needs to be modified. Remove the imageReference field from the storageProfile field and add a new image field to the osDisk field that points to the VHD file containing the disk image we want to use when creating the new virtual machine. The result should look like this:

"storageProfile": {
  "osDisk": {
    "name": "[parameters('virtualMachine_name')]",
    "createOption": "FromImage",
    "image": {
      "uri": "[concat('https', '://', parameters('storageAccount_name'), '.blob.core.windows.net', concat('/{nazov kontajnera, do ktoreho ste skopirovali image OS disku}/',parameters('virtualMachine_image_path') ))]"
     },
    "vhd": {
    "uri": "[concat('https', '://', parameters('storageAccount_name'), '.blob.core.windows.net', concat('/{nazov kontajnera kde bude vytvoreny disk pre novy virtualny server}/',parameters('virtualMachine_name'), ".vhd"))]"
    },
    "caching": "ReadWrite",
    "osType": "Windows"
  },
  "dataDisks": []
}

Note the value of the createOption key, where the value "FromImage" specifies that the new VM will be created from an existing image. The Uri key inside the vhd field determines where the disk of the new VM will be created, while the Uri key inside the image field gives the address, i.e. the location, of the VHD file the VM will be created from. This value must match the location to which you copied the disk image of the original virtual machine. In our case we defined the virtualMachine_image_path parameter for it.

The whole generalized template could look, for example, like this:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "virtualMachine_adminPassword": {
      "defaultValue": null,
      "type": "SecureString"
    },

    "virtualMachine_name": {
      "defaultValue": null,
      "type": "String"
    },

    "virtualMachine_image_path": {
      "defaultValue": "prefix-imagename.vhd",
      "type": "String"
    },

    "networkInterfaces_name": {
      "defaultValue": "network interface name",
      "type": "String"
    },

    "networkSecurityGroups_nsg_name": {
      "defaultValue": "network security group name",
      "type": "String"
    },

    "publicIPAddresses_ip_name": {
      "defaultValue": "public IP name",
      "type": "String"
     },

    "virtualNetworks_vnet_name": {
      "defaultValue": "virtual network name",
      "type": "String"
    },

    "storageAccount_name": {
      "defaultValue": "new storage account name",
      "type": "String"
    }
  },

  "variables": {},

  "resources": [
  {
    "type": "Microsoft.Compute/virtualMachines",
    "name": "[parameters('virtualMachine_name')]",
    "apiVersion": "2015-06-15",
    "location": "westeurope",

    "properties": {
      "hardwareProfile": {
        "vmSize": "Standard_DS3"
      },

      "storageProfile": {

        "osDisk": {
          "name": "[parameters('virtualMachine_name')]",
          "createOption": "FromImage",

          "image": {
            "uri": "[concat('https', '://', parameters('storageAccount_name'), '.blob.core.windows.net', concat('/{nazov kontajnera do ktoreho ste skopirovali image OS disku}/',parameters('virtualMachine_image_path') ))]"
           },

          "vhd": {
            "uri": "[concat('https', '://', parameters('storageAccount_name'), '.blob.core.windows.net', concat('/{nazov kontajnera kde bude vytvoreny disk pre novy virtualny server}/',parameters('virtualMachine_name'), '.vhd'))]"
          },

          "caching": "ReadWrite",
          "osType": "Windows"
        },

        "dataDisks": []
      },

      "osProfile": {
        "computerName": "[parameters('virtualMachine_name')]",
        "adminUsername": "adminUsername",

        "windowsConfiguration": {
          "provisionVMAgent": true,
          "enableAutomaticUpdates": true
         },

        "secrets": [],
        "adminPassword": "[parameters('virtualMachine_adminPassword')]"
      },

      "networkProfile": {
        "networkInterfaces": [
        {
            "id": "[resourceId('Microsoft.Network/networkInterfaces', parameters('networkInterfaces_name'))]"
        }
        ]
      }
    },

    "resources": [],

    "dependsOn": [
      "[resourceId('Microsoft.Storage/storageAccounts', parameters('storageAccount_name'))]",
      "[resourceId('Microsoft.Network/networkInterfaces', parameters('networkInterfaces_name'))]"
    ]
  },
  {

    "type": "Microsoft.Network/networkInterfaces",
    "name": "[parameters('networkInterfaces_name')]",
    "apiVersion": "2016-03-30",
    "location": "westeurope",

    "properties": {
      "ipConfigurations": [
      {
        "name": "ipconfig1",
        "properties": {
          "privateIPAddress": "10.0.0.6",
          "privateIPAllocationMethod": "Dynamic",
            "publicIPAddress": {
              "id": "[resourceId('Microsoft.Network/publicIPAddresses', parameters('publicIPAddresses_ip_name'))]"
             },

            "subnet": {
              "id": "[concat(resourceId('Microsoft.Network/virtualNetworks', parameters('virtualNetworks_vnet_name')), '/subnets/default')]"
            }
          }
        }
        ],
        "dnsSettings": {
          "dnsServers": []
        },
       "enableIPForwarding": false,
       "networkSecurityGroup": {
         "id": "[resourceId('Microsoft.Network/networkSecurityGroups', parameters('networkSecurityGroups_nsg_name'))]"
       }
     },

     "resources": [],
     "dependsOn": [
       "[resourceId('Microsoft.Network/publicIPAddresses', parameters('publicIPAddresses_ip_name'))]",
       "[resourceId('Microsoft.Network/virtualNetworks', parameters('virtualNetworks_vnet_name'))]",
       "[resourceId('Microsoft.Network/networkSecurityGroups', parameters('networkSecurityGroups_nsg_name'))]"
      ]
    },
    {
      "type": "Microsoft.Network/networkSecurityGroups",
      "name": "[parameters('networkSecurityGroups_nsg_name')]",
      "apiVersion": "2016-03-30",
      "location": "westeurope",
    "properties": {
      "securityRules": [
      {
        "name": "default-allow-rdp",
        "properties": {
          "protocol": "TCP",
          "sourcePortRange": "*",
          "destinationPortRange": "3389",
          "sourceAddressPrefix": "*",
          "destinationAddressPrefix": "*",
          "access": "Allow",
          "priority": 1000,
          "direction": "Inbound"
        }
      }
      ]
    },

    "resources": [],
    "dependsOn": []
  },
  {
    "type": "Microsoft.Network/publicIPAddresses",
    "name": "[parameters('publicIPAddresses_ip_name')]",
    "apiVersion": "2016-03-30",
    "location": "westeurope",
    "properties": {
      "publicIPAllocationMethod": "Dynamic",
      "idleTimeoutInMinutes": 4
    },
    "resources": [],
    "dependsOn": []
  },
  {
    "type": "Microsoft.Network/virtualNetworks",
    "name": "[parameters('virtualNetworks_vnet_name')]",
    "apiVersion": "2016-03-30",
    "location": "westeurope",
    "properties": {
      "addressSpace": {
        "addressPrefixes": [
          "10.0.0.0/16"
         ]
      },
      "subnets": [
      {
        "name": "default",
        "properties": {
          "addressPrefix": "10.0.0.0/24"
         }
       }
       ]
    },
    "resources": [],
    "dependsOn": []
  },
  {
    "type": "Microsoft.Storage/storageAccounts",
    "sku": {
      "name": "Premium_LRS",
      "tier": "Premium"
    },
    "kind": "Storage",
    "name": "[parameters('storageAccount_name')]",
    "apiVersion": "2016-01-01",
    "location": "westeurope",
    "tags": {},
    "properties": {},
    "resources": [],
    "dependsOn": []
  }
  ]
}

4. Deploying the VM using the modified template

With the template ready, we can proceed to deploy the VM with it. In this section we will show how to perform the deployment using PowerShell.

Deployment via PowerShell can be done with 4 commands:

#Sign in to the Azure Resource Manager environment
Add-AzureRmAccount

#Select the subscription
Set-AzureRmContext -SubscriptionID <YourSubscriptionId>

#Optional step - this command validates the template, which can prevent a failure during the deployment itself
Test-AzureRmResourceGroupDeployment -ResourceGroupName <Target resource group name> -TemplateFile <Path to the ARM template>

#Deploy using the template
New-AzureRmResourceGroupDeployment -Name <Deployment name> -ResourceGroupName <Target resource group name> -TemplateFile <Path to the ARM template>
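If you prefer not to be prompted for parameter values interactively, they can also be supplied up front; both variants below are sketches with illustrative names:

#Variant A: supply the values from a parameters file
New-AzureRmResourceGroupDeployment -Name <Deployment name> -ResourceGroupName <Target resource group name> -TemplateFile <Path to the ARM template> -TemplateParameterFile <Path to parameters.json>

#Variant B: supply the values inline as dynamic parameters (the names must match the template)
New-AzureRmResourceGroupDeployment -Name <Deployment name> -ResourceGroupName <Target resource group name> -TemplateFile <Path to the ARM template> -virtualMachine_name 'myreplicavm' -virtualMachine_adminPassword (Read-Host -Prompt 'Admin password' -AsSecureString)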

The chosen resource group should be the same one in which you created the storage account you copied the OS disk image into. After running the last command, PowerShell prompts you for the values of any parameters that have no default value in the file. Once these values are provided, the deployment process starts; it can take several minutes, and the result should look like this:

8

 

In this article we showed in detail how to create a replica of a server using Azure Resource Manager. We walked through capturing the disk image, copying it to the storage account where the OS disk of the newly deployed server will also be created, then creating the ARM template and deploying the VM with it. This approach can significantly simplify creating a VM with a preconfigured environment from an existing image. There is still room to increase the degree of automation, namely to trigger the VM deployment process in Azure automatically. In the next article we will show how to automate this trigger using Visual Studio Team Services.

Announcing an Update to the Visual Studio 2017 Release Candidate


 

This article is a digest translation of a post from The Visual Studio Blog at Microsoft headquarters.
[Original article] Update to Visual Studio 2017 Release Candidate, January 27, 2017

 

Today Microsoft released an updated version of the Visual Studio 2017 Release Candidate. Some of you may have noticed that we posted an article about the RC update yesterday; we temporarily took it down because of a setup issue. The issue has been resolved, so we are announcing the update again. To get the latest version, click the hyperlink above or click the notification that appears in Visual Studio.

Details of this update are available in the Visual Studio 2017 release notes and known issues (in English). In this article we would like to highlight the important changes.

  • The .NET Core and ASP.NET Core workloads are out of preview. This update fixes multiple bugs and improves the usability of the .NET Core and ASP.NET Core tooling.
  • We improved the Team Explorer connect experience so it is easier to find the projects and repositories you want to connect to.
  • In response to many requests, we brought back the advanced save options.
  • We fixed several installation issues, including hangs, and added a Retry button that appears when an installation fails. Visual Studio's installation entry now shows up clearly in the Start menu, and we added support for creating an offline installation layout.

Beyond these improvements, the Data Science and Python development workloads were removed. In addition, some components, such as translations into languages other than English, will not be supported in this VS release and fall outside the release requirements; these will become available as separate downloads soon. F# continues to be available in the .NET desktop and .NET web development workloads.

Please try the latest version and send us your feedback. If you run into problems, report them via the Report a Problem (English) option in the installer or in the Visual Studio IDE itself. The developer community portal (English) is also available. We would be glad to receive your suggestions through UserVoice (English).

John Montgomery, Director of Program Management for Visual Studio

@JohnMont is responsible for product design and customer support for Visual Studio, C++, C#, VB, .NET, and JavaScript. He joined Microsoft 18 years ago and has been building developer technologies ever since.

Extending Dynamics 365 for Operations


Dynamics 365 for Operations has great support for implementing pure add-on solutions. A pure add-on solution extends the existing functionality using non-intrusive extension points.

 

In the Dynamics AX era, implementing non-intrusive solutions was nearly impossible. My decade-old Channel 9 video on Smart customizations recommends being as little intrusive as possible.

That is still good advice, and it turns out that we can now be significantly less intrusive. A lot of new extensibility features have been added, and this series of blog posts will explore them.

Why does it matter?

When a solution is implemented using extension points only, its cost of maintenance drops significantly:

  • The code conflicts that are typically experienced during code upgrade cease to exist.
  • The compilation time is reduced by orders of magnitude.
  • The package size is reduced.

Some terminology

A good citizen refers to a well-behaved extension that can live side by side with other extensions using the same extension point. The opposite is a bully: bullies claim the extension point for themselves and are at risk of colliding with other solutions.

Blog posts

  1. Subscribing to events
  2. Adding new methods to existing classes, tables, forms, etc.

More to come soon – stay tuned.

 

THIS POST IS PROVIDED AS-IS; AND CONFERS NO RIGHTS.

Cloud computing guide for researchers – Azure IoT for Croke Park Smart Stadium


Using sensors in a busy environment, such as a sports stadium, can unveil deep insights to provide a better experience for visitors to a game, the local community, and tourists wanting to enjoy the venue. Deploying sensors, collecting data, processing it, making sense of the information, and visualising it are key challenges that researchers at Dublin City University were tasked with solving at Croke Park Stadium in Dublin, Ireland, working with Intel and Microsoft’s technical team.

croke01

The team used the Azure IoT Suite to easily build an end-to-end solution that gathers data from sound sensors and a weather station and surfaces the analysed information in dashboards via Power BI. The system can be used to give fans fun experiences and to encourage louder cheering for the home team! It also helps monitor sound pollution, improving relationships with the local community. The weather station improves health and safety for the fantastic Etihad Skyline tour.

croke02

You can read the full story by Niall Moran here, which goes into the technical architecture and implementation details.
