
Configuring C++ IntelliSense and Browsing


Whether you are creating a new C++ project (or modifying an existing one) using a wizard, or importing a project into Visual Studio from another IDE, it’s important to configure the project correctly for the IntelliSense and Browsing features to provide accurate information.  This article provides some tips on configuring projects and describes a few ways you can investigate configuration problems.

Include Paths and Preprocessor Macros

The two settings that have the greatest effect on the accuracy of IntelliSense and Browsing operations are the Include Paths and the Preprocessor macros.  This is especially important for projects that are built outside of Visual Studio: such a project may build without any errors, yet show squiggles in the Visual Studio IDE.

To check the project’s configuration, open the Properties for your project.  By default, All Configurations and All Platforms will be selected, so that the changes will be applied to all build configurations:

If some configurations do not have the same values as the rest, you will see <different options>. If your project is a Makefile project, you will see the following properties dialog. In this case, the settings controlling IntelliSense and Browsing are under the NMake property page, IntelliSense category:

Error List

If IntelliSense is showing incorrect information (or fails to show anything at all), the first place to check is the Error List window.  Earlier errors may be preventing IntelliSense from working correctly.  To see all the errors for the current source file together with all included header files, enable showing IntelliSense errors in the Error List window by making this selection in the dropdown:

Error List IntelliSense Dropdown

IntelliSense limits the number of errors it produces to 1000. If there are over 1000 errors in the header files included by a source file, then the source file will show only a single error squiggle at the very start of the source file.

Validating Project Settings via Diagnostic Logging

To check whether IntelliSense compiler is using correct compiler options, including Include Paths and Preprocessor macros, turn on Diagnostic Logging of IntelliSense command lines in Tools > Options > Text Editor > C/C++ > Advanced > Diagnostic Logging. Set Enable Logging to True, Logging Level to 5 (most verbose), and Logging Filter to 8 (IntelliSense logging):

Enabling Diagnostic Logging in Tools > Options > Text Editor > C/C++ > Advanced

The Output Window will now show the command lines that are passed to the IntelliSense compiler. Here is a sample output that you may see:

 [IntelliSense] Configuration Name: Debug|Win32
 [IntelliSense] Toolset IntelliSense Identifier:
 [IntelliSense] command line options:
 /c
 /I.
 /IC:\Repo\Includes
 /DWIN32
 /DDEBUG
 /D_DEBUG
 /Zc:wchar_t-
 /Zc:forScope
 /Yustdafx.h

This information may be useful in understanding why IntelliSense is providing inaccurate information. One example is unevaluated project properties: if your project’s Include directory contains $(MyVariable)\Include, and the diagnostic log shows /I\Include as an include path, it means that $(MyVariable) wasn’t evaluated and was removed from the final include path.

IntelliSense Build

In order to evaluate the command lines used by the IntelliSense compiler, Visual Studio launches an IntelliSense-only build of each project in the solution. MSBuild performs the same steps as the project build, but stops short of executing any of the build commands: it only collects the full command line.

If your project contains some custom .props or .targets files, it’s possible for the IntelliSense-only build to fail before it finishes computing the command lines. Starting with Visual Studio 2017 15.6, errors from the IntelliSense-only build are logged to the Output window, Solution pane.

Output Window, Solution Pane

An example error you may see is:
 error: Designtime build failed for project 'E:\src\MyProject\MyProject.vcxproj',
 configuration 'Debug|x64'. IntelliSense might be unavailable.
 Set environment variable TRACEDESIGNTIME=true and restart
 Visual Studio to investigate.

If you set the environment variable TRACEDESIGNTIME to true and restart Visual Studio, you will see a log file in the %TEMP% directory which will help diagnose this error:

C:\Users\me\AppData\Local\Temp\MyProject.designtime.log :
 error : Designtime build failed for project 'E:srcMyProjectMyProject.vcxproj',
 configuration 'Debug|x64'. IntelliSense might be unavailable.

To learn more about the TRACEDESIGNTIME environment variable, please see the articles from the Roslyn and Common Project System projects. The C++ project system is based on the Common Project System, so the information in those articles applies to all C++ projects.

Single File IntelliSense

Visual Studio allows you to take advantage of IntelliSense and Browsing support for files that are not part of any existing project. By default, files opened in this mode will not display any error squiggles but will still provide IntelliSense; so if you don’t see any error squiggles under incorrect code, or if some expected preprocessor macros are not defined, check whether the file is opened in Single-File mode. To do so, look at the Project node in the Navigation Bar: the project name will be Miscellaneous Files:

Navigation Bar showing Miscellaneous Files project

Investigating Open Folder Issues

Open Folder is a new command in Visual Studio 2017 that allows you to open a collection of source files that doesn’t contain any Project or Solution files recognized by Visual Studio. To help configure IntelliSense and browsing for code opened in this mode, we’ve introduced a configuration file CppProperties.json. Please refer to this article for more information.

CppProperties.json Syntax Error

If you mistakenly introduce a syntax error into the CppProperties.json file, IntelliSense in the affected files will be incorrect. Visual Studio will display the error in the Output Window, so be sure to check there.

Project Configurations

In Open Folder mode, different configurations may be selected using the Project Configurations toolbar.

Project Configurations Dropdown

Please note that if multiple CppProperties.json files provide differently-named configurations, then the selected configuration may not be applicable to the currently-opened source file. To check which configuration is being used, turn on Diagnostic Logging to check for IntelliSense switches.

Single-File IntelliSense

When a solution is open, Visual Studio will provide IntelliSense for files that are not part of the solution using the Single-File mode.  Similarly, in Open Folder mode, Single-File IntelliSense will be used for all files outside of the directory cone.  Check the Project name in the Navigation Bar to see whether the Single-File mode is used instead of CppProperties.json to provide IntelliSense for your source code.

Investigating Tag Parser Issues

Tag Parser is a ‘fuzzy’ parser of C++ code, used for Browsing and Navigation.  (Please check out this blog post for more information.)

Because the Tag Parser doesn’t evaluate preprocessor macros, it may stumble while parsing code that makes heavy use of them. When the Tag Parser encounters an unfamiliar code construct, it may skip a large region of code.

There are two common ways for this problem to manifest itself in Visual Studio. The first way is by affecting the results shown in the Navigation Bar. If instead of the enclosing function, the Navigation Bar shows an innermost macro, then the current function definition was skipped:

Navigation Bar shows incorrect scope

The second way the problem manifests is by showing a suggestion to create a function definition for a function that is already defined:

Spurious Green Squiggle

In order to help the parser understand the content of macros, we have introduced the concept of hint files. (Please see the documentation for more information.) Place a file named cpp.hint in the root of your solution directory, add to it all the code-altering preprocessor definitions (e.g. #define do_if(condition) if(condition)), and invoke the Rescan Solution command, as shown below, to help the Tag Parser correctly understand your code.
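For example, a macro pair that expands to a namespace can hide everything between its uses from the Tag Parser. A minimal sketch (the BEGIN_NAMESPACE/END_NAMESPACE macros here are invented for this illustration):

// Widgets.cpp - the compiler expands these macros during the build, but the
// Tag Parser does not, so without hints it may skip this region and Draw()
// disappears from Browsing and Navigation results.
BEGIN_NAMESPACE(Widgets)
void Draw()
{
    // ...
}
END_NAMESPACE()

Adding the corresponding definitions to cpp.hint lets the Tag Parser expand them (hint files are read only for their preprocessor definitions):

// cpp.hint, placed in the solution root
#define BEGIN_NAMESPACE(x) namespace x {
#define END_NAMESPACE() }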

Coming soon: Tag Parser errors will start to appear in the Error List window. Stay tuned!

Scanning for Library Updates

Visual Studio periodically checks whether files in the solution have been changed on disk by other programs.  As an example, when a ‘git pull’ or ‘git checkout’ command completes, it may take up to an hour before Visual Studio becomes aware of any new files and starts providing up-to-date information.  In order to force a rescan of all the files in the solution, select the Rescan Solution command from the context menu:

Rescan Solution Context Menu

The Rescan File command, seen in the screenshot above, should be used as the last diagnostic step.  In the rare instance that the IntelliSense engine loses track of changes and stops providing correct information, the Rescan File command will restart the engine for the current file.

Send us Feedback!

We hope that these starting points will help you diagnose any issues you encounter with IntelliSense and Browsing operations in Visual Studio. Please report any issues you discover by using the Help > Send Feedback > Report A Problem command. All reported issues can be viewed at the Developer Community.


Microsoft Drivers 5.2.0 for PHP for SQL Server Released!


Hi all,

We are excited to announce the production ready release for the Microsoft Drivers 5.2.0 for PHP for SQL Server. The drivers now support basic select/insert/update/delete functionality with the Always Encrypted feature. The driver enables access to SQL Server, Azure SQL Database and Azure SQL DW from PHP 7.0-7.2 applications on Linux, Windows and macOS.

Notable items about 5.2.0 since 4.3.0:

Added

  • Added support for Always Encrypted (see Features)
    • Support for Windows Certificate Store
    • Support for inserting into and modifying an encrypted column
    • Support for fetching from an encrypted column
  • Added support for PHP 7.2
  • Added support for Microsoft ODBC Driver 17 for SQL Server
  • Added support for Ubuntu 17 (requires Microsoft ODBC Driver 17 for SQL Server)
  • Added support for Debian 9 (requires Microsoft ODBC Driver 17 for SQL Server)
  • Added support for SUSE 12
  • Added Driver option to specify the Microsoft ODBC driver
    • Valid options are "ODBC Driver 17 for SQL Server", "ODBC Driver 13 for SQL Server", and "ODBC Driver 11 for SQL Server"
    • The default driver is ODBC Driver 17 for SQL Server

Changed

  • Implementation of PDO::lastInsertId($name) to return the last inserted sequence number if the sequence name is supplied to the function (lastInsertId)

Fixed

  • Issue #555 - Hebrew strings truncation (requires Microsoft ODBC Driver 17)
  • Adjusted precisions for numeric/decimal inputs with Always Encrypted
  • Support for non-UTF8 locales in Linux and macOS
  • Fixed crash caused by executing an invalid query in a transaction (Issue #434)
  • Added error handling for using PDO::SQLSRV_ATTR_DIRECT_QUERY or PDO::ATTR_EMULATE_PREPARES in a Column Encryption enabled connection
  • Added error handling for binding TEXT, NTEXT or IMAGE as output parameter (Issue #231)
  • PDO::quote with string containing ASCII NUL character (Issue #538)
  • Decimal types with no decimals are correctly handled when Always Encrypted is enabled (PR #544)
  • BIGINT as an output param no longer results in value out of range exception when the returned value is larger than a maximum integer (PR #567)

Removed

  • Dropped support for Ubuntu 15
  • Supplying a table name to PDO::lastInsertId($name) no longer returns the last inserted row (lastInsertId)

Limitations

  • Always Encrypted is not supported on Linux and macOS
  • On Linux and macOS, setlocale() only takes effect if it is invoked before the first connection. Attempting to set the locale after connecting will not work
  • Always Encrypted functionalities are only supported using MS ODBC Driver 17
  • Always Encrypted limitations
  • When using sqlsrv_query with Always Encrypted feature, SQL type has to be specified for each input (see here)
  • No support for inout / output params when using sql_variant type

Known Issues

  • Connection pooling on Linux doesn't work properly when using Microsoft ODBC Driver 17
  • When pooling is enabled in Linux or macOS
    • unixODBC <= 2.3.4 (Linux and macOS) might not return proper diagnostics information, such as error messages, warnings and informative messages
    • due to this unixODBC bug, fetch large data (such as xml, binary) as streams as a workaround. See the examples here
  • Connection with Connection Resiliency enabled does not resume properly with Connection Pooling (Issue #678)
  • With ColumnEncryption enabled, calling stored procedure with XML parameter does not work (Issue #674)
  • Cannot connect with both Connection Resiliency enabled and ColumnEncryption enabled (Issue #577)
  • With ColumnEncryption enabled, retrieving a negative decimal value as output parameter causes truncation of the last digit (Issue #705)
  • With ColumnEncryption enabled, cannot insert a double into a decimal column with precision and scale of (38, 38) (Issue #706)
  • With ColumnEncryption enabled, when fetching decimals as output parameters bound to PDO::PARAM_BOOL or PDO::PARAM_INT, floats are returned, not integers (Issue #707)

Survey

Let us know how we are doing and how you use our driver by taking our pulse survey: https://aka.ms/mssqlphpsurvey

Get Started

Direct downloads can be found on the Github release tag.

David Engel

Audit SQL Server stop, start, restart


In this article, Application Development Manager Steve Keeler outlines an approach for determining the domain identity of a user who has initiated a stop, start, or restart request on SQL Server services. Although SQL Server contains server and database auditing functionality as part of the product, this cannot be used to determine the identity of a user changing the service state of a SQL Server instance since that operation is occurring at the system rather than database level.


I recently worked with a customer to help resolve an issue with Team Foundation Server where collection databases were not coming online following a service restart. While working on this issue, the following question was posed: "how can we identify the user(s) responsible for stopping, starting, and restarting SQL Server services?".

Preliminary investigation into using SQL Server audit functionality yielded no solution. Microsoft database specialists confirmed that the closest auditing SQL Server could provide for service restarts would tie that operation to the privileged 'sa' identity. Liju Varghese, one of Microsoft's Premier Field Engineers specializing in security, provided a quick overview of how to implement service-level auditing.

The following sections provide details on the settings required at the operating system level to audit service management operations, in this case for the SQL Server service. This is done using group policy objects.


Continue reading here.


Premier Support for Developers provides strategic technology guidance, critical support coverage, and a range of essential services to help teams optimize development lifecycles and improve software quality.  Contact your Application Development Manager (ADM) or email us to learn more about what we can do for you.

Installing the HoloLens RS4 Preview


This content is a rough translation of https://docs.microsoft.com/ja-jp/windows/mixed-reality/hololens-rs4-preview.

A HoloLens preview of RS4, the next version of Windows 10, has been released. Since this is still a preview build, install it at your own risk.

By downloading and using the HoloLens RS4 Preview, you accept the HoloLens RS4 Preview End User License Agreement (EULA).

Installing this preview erases all content and apps on the HoloLens and restores it to its factory state. Also, because it is a preview, bugs of one kind or another may appear. For that reason, please use it only if you know HoloLens well and have no problem applying further updates later.

Before installing the preview, you first need to download the latest version of the Windows Device Recovery Tool and enroll the HoloLens in the Windows Insider Preview program.

The HoloLens RS4 Preview package

Download the package here. Unpacking it yields two files:

  • rs4_release_svc_analog.retail.10.0.17123.1004.ffu — the HoloLens RS4 Preview image
  • HoloLens 2018 Preview - End User License Agreement (EULA).pdf — the license agreement (EULA)

Installing the preview

  1. Start the retail HoloLens (Windows Holographic 10.0.14393) and opt the device in to Insider Preview builds so that the RS4 preview can be applied.
    • Open the Settings app -> Update & Security -> Get Insider Preview builds -> Get started.
    • Select Restart to apply the Insider Preview build, let the device reboot, and wait for it to start up again.
    • If anything is unclear, refer to the Reset & Recovery instructions.
  2. Install the Windows Device Recovery Tool (WDRT) from https://aka.ms/wdrt. The version should be 3.14.07501 (or later).
  3. Use the Windows Device Recovery Tool to flash the preview build of the OS.
    1. Start the Windows Device Recovery Tool from the Start menu or the desktop shortcut.
    2. Connect the HoloLens over USB, and once the Windows Device Recovery Tool recognizes it, select Microsoft HoloLens.
    3. Select Manual package selection at the bottom of the screen and choose the downloaded .ffu OS image file (so the downloaded OS image is flashed directly).
    4. Confirm that the Local Package version is 10.0.17123.1003 (or later), then press the Install Software button to start installing the OS.
    5. A WARNING appears stating that this process erases everything on the HoloLens device; once you accept it, press the Continue button.
    6. Installation takes a few minutes, during which a progress bar is shown on screen. (The display then changes from the progress bar to "Waiting for Device to boot", and the HoloLens itself shows spinning gears while it updates.)
    7. Once installation finishes the device reboots, so press the Finish button to end the process.
    8. To check the new OS version, select the device again in the Recovery Tool and look at the Device Info page.
  4. During the HoloLens first-run setup, sign in with your personal or work account and try the new features.

New features

  • Automatic placement of 2D and 3D content at launch (no more one-tap placement at startup)
  • Windows can be moved, rotated, and resized without entering Adjust mode
  • 2D windows can be enlarged horizontally only
  • Expanded voice commands (Go to Start, Move this)
  • Updated Holograms and Photos apps
  • Improved Mixed Reality Capture (hold volume up + down for 3 seconds to start recording)
  • Audio improvements
  • File Explorer can be used on the HoloLens
  • Easier access to HoloLens photos, videos, and documents from a desktop PC
  • Support for browser-based Wi-Fi (captive portal) authentication during setup

Developer updates

  • Spatial Mapping improvements
  • Automatic focus point selection using the depth buffer
  • Holographic reprojection can be disabled
  • APIs report in more detail whether an app is running on a HoloLens or on an immersive MR device

New enterprise features

  • Multiple Azure AD accounts can be used
  • The Wi-Fi network can be changed at sign-in
  • A work account can be added to a personal Microsoft account more easily
  • Mail can sync even without an MDM environment

Information for IT Pros

  • The new OS name for the Commercial Suite is Windows Holographic for Business
  • Setup can be configured (calibration and other steps can be hidden during initial setup)
  • Windows Configuration Designer
  • Bulk Azure AD token support
  • Provisioning packages can be created for the Developer CSP
  • Assigned access for kiosk mode
  • Collection of setup log information
  • Relaxed handling of local account password expiration
  • Improved MDM sync status and details

Known issues

  • Several people have reported a problem with the Windows Insider Program settings. If it happens to you, capture the bug details in the Feedback Hub and reflash the device.

For developers

Feedback and issue reports

  • Please send feedback and report issues through the Feedback Hub on the HoloLens. Feedback Hub can include detailed internal information in a report, which helps get problems resolved quickly.
  • Note that Feedback Hub will ask for permission to use the Documents folder; please select "Yes".

Questions and Support

devconf’s log #3 – Sessions, Cubes, and lots of collaboration


Continued from devconf’s log #2 – Sun, Mandela, and no Bowtie. After lots of meetings, discussions, and content preparation we finally completed the devconf event in Johannesburg. I’ll share a lot more photos and the feedback from today’s sessions soon.

Here are a few pictures to give you a glimpse of a phenomenal day of collaboration and sessions.

A great start to a new day, with a death-star like walk to the event

Johannesburg program

Lots of swag … to mix-up the cube or not

It was great to mingle with the phenomenal BBD engineers … even if one gives speeches barefoot 🙂

Ending the day with another speaker dinner and great discussions

Tomorrow we’re off to Cape Town for the next devconf day. It will be a blast. I’m stoked!  Watch this space.

Get Community driven Docker images for Web app for Containers


You can now find community-driven docker images to try out on Web App for Containers on GitHub. These images follow best practices for Web App for Containers and contain SSH for debugging purposes.

How to deploy

These docker images can be found on Docker Hub at hub.docker.com/r/appsvcorg.  Use the latest tag for the most recent version of the image when deploying it to a web app on Azure.

To deploy your application using these community images, follow the steps below:

  • Log in to the Azure portal
  • Create a new web app from the Web App for Containers template
  • Under Configure container, select
    • Image source as Docker Hub
    • Repository access as public
    • Enter the docker image name in this format: appsvcorg/django-python:0.1 or appsvcorg/django-python:latest

  • Click Create to start the deployment
  • View the Readme.md for the docker image you selected on GitHub in case additional configuration is needed after the web app is created.

How to contribute

To make sure your docker image is included, please follow these guidelines. Any docker image that is out of compliance will be added to the blacklist and removed from the Docker Hub repository https://hub.docker.com/r/appsvcorg.

Here is the end-to-end process flow for contributing to this Docker Hub repository.

When you contribute a new docker image, as the owner of that docker image your responsibilities include:

  • review issues reported on the docker image on Github
  • fix and resolve bugs or compliance issues with the docker image
  • keep the docker image up to date

Get started on how to contribute to the Github repository.

How to remove a docker image from Docker hub

Docker images can be removed from the Docker Hub and GitHub repositories when either of the two cases below applies:

  • If the owner/primary maintainer of the docker image no longer wishes to maintain it and wants it removed because they no longer support it, please report it to appgal@microsoft.com to have the docker image removed from the Docker Hub and GitHub repositories.
  • If a docker image is outdated, or has bugs that remain unresolved for more than 3 months, the docker image will be removed.

How to report issues

If you want to report issues, please file an issue here, providing all the information shown in this template so that we can help resolve it.

Unit Testing Your JavaScript Code


In a recent post from his blog, Premier Developer Consultant Jim Blizzard discusses how to set up Visual Studio 2017 to run JavaScript-based unit tests.


This week, I demonstrated to a client how they could write unit tests in JavaScript to test their JavaScript code by leveraging Karma, Jasmine, and Chutzpah. The unit tests show up in Test Explorer just like unit tests written in C# do. Setting things up isn’t very difficult and can be completed in just a few minutes.

Let’s take a look at how you can do it in your environment while we start to create a JavaScript library that calculates the score of a bowling game.

Continue reading more on Jim’s blog post.

Important Updates About Us and Access Control Service


UPDATE: Access Control Service will be deprecated on November 7, 2018, and the ability to create new Access Control Service namespaces will be stopped on May 1, 2018 as part of this deprecation process. This includes previously whitelisted user Subscription IDs.

For additional information please take a look at this blog post.

For Service Bus, Event Hubs, and Relay customers that currently use Access Control Service (ACS) namespaces, guidance on how to migrate is available in the following articles:

Other migration guidance about ACS can be found here.

We highly suggest you begin planning and executing on a migration strategy today.


PIX 1803.25 – GPU Occupancy, CPU sampling, automatic shader PDB resolution, and more


Today we released PIX 1803.25 which includes numerous new and updated features:

  • GPU Occupancy provides detailed insight into how shader workloads execute on the GPU. As announced at GDC we have collaborated with NVIDIA to bring console-level performance details about how shaders execute on the hardware to PIX on Windows. Many thanks to our partners at NVIDIA for helping us enable this great feature in PIX on Windows.
    • While the Execution Duration timeline lane shows the total execution time for draws and other rendering operations, the new Occupancy lane shows the associated work as it moves through the GPU’s rendering pipeline. You’ll see how work is broken into vertex shader work, pixel shader work, and so forth giving you a much more accurate view of how the hardware handles your rendering.
    • The Occupancy lane shows VS, HS, DS, GS, PS, and CS work corresponding to the different pipeline stages as well as a stage labeled Internal which allows GPU vendors to account for work that doesn’t map to any of the conventional pipeline stages.
    • To collect GPU occupancy data you have to Collect Timing Data first. Once the timing data is available, click the Enable button in the Occupancy lane to collect the data and populate the view.
    • This feature is currently only available on NVIDIA GPUs and it requires an updated driver. Please make sure to get version 391.35 or later to use this feature.
    • We’re working on surfacing this information for other GPUs as well, so stay tuned for updates.
  • Timing Captures now include an option to collect and analyze CPU samples.
    • PIX now includes a CPU sampling profiler that can optionally be run when taking a timing capture. Viewing CPU samples is useful for determining what code is running on a thread or core for portions of your title that either have been sparsely instrumented with PIX events or not instrumented at all. The integration of samples into timing captures improves the iteration time of your profiling tasks because you can now get a sense of what code is executing at any point in time without having to go back and add additional instrumentation to your title.
  • Timing captures now include the ability to visualize the execution times of individual functions in your title.
    • Timing captures now allow you to track the execution of your title’s functions. You can select which functions to track by viewing callstacks for samples, by analyzing an aggregate view of the samples, or by selecting functions from your title’s PDBs. Each call to the functions you track is displayed on an additional lane per thread (or core) in the capture. As with the ability to collect CPU samples, the ability to track functions improves your iteration time by drastically reducing the need to add instrumentation to your title, rebuild, and redeploy.
  • Automatic shader PDB resolution. We’ve improved how PIX resolves shader debug information (shader PDBs) and described the process for how you can set up your build system to support this. For more information see the documentation for this feature.
    • Your build system can use a suggested, unique name for each shader’s PDB data, and can then strip and store the debug information in a dedicated file. If you follow this process PIX can automatically discover and resolve your shader PDBs and thus provide much better support for inspecting, editing, and debugging shaders.
  • The GPU Memory Usage tab in timing captures now tracks Pipeline States, Command Allocators, and Descriptor Heaps as well.
  • Improved TDR debugging support.
    • We have fixed issues for titles using ExecuteIndirect.
  • Improvements to Pipeline view and resource history.
    • The Pipeline view is now much faster to use. PIX tracks resource access for the entire capture the first time you access the Pipeline or Resource History views. Once this is done these views update near-instantaneously when selecting a new event.
    • Resource history now accurately reflects which bound resources were accessed by shaders.
    • Shader access tracking now also supports DXIL shaders.
  • The Shader Debugger now supports debugging of geometry shaders.
  • Support for buffered IO in the File IO profiler
    • The file IO profiler now displays file accesses from your title even if they’ve been satisfied by the Windows disk cache. A new IO Event Type column in the event list lets you choose between viewing buffered events, non-buffered events or both.
  • DirectX Raytracing (DXR) support. This release also adds enhancements to the support for the new experimental DXR features we released to support the recent GDC announcement. Please make sure to check the dedicated release note and documentation for details. New features in this release:
    • You can now view samplers for DispatchRays calls.
    • Fixed global root parameters for DispatchRays calls when a graphics PSO is bound.
  • Fixed several bugs. Thanks to all of you out there who reported issues!

As always, please let us know if you have feedback on any of the features in PIX on Windows.

GPU occupancy
GPU occupancy view

The screenshot above shows how PIX on Windows visualizes GPU occupancy.

  1. Vertex shader work for this event.
  2. Pixel shader work for events other than this one.
  3. Pixel shader work for this event.

CPU sampling
CPU sampling in timing capture

Leveraging the new Time Travel Trace API in Debugging tools to find when one or more SharePoint event happened


In my previous post, I showed a proof-of-concept script to list all occasions on which a process opened a file. JavaScript is easy to program and works for most cases; however, on some occasions you need to access resources not available from JavaScript, and only a full-fledged debugging extension will do. In this post I will show some highlights of a debugging extension using the new Time Travel Debugging and the new API, which at the time of writing was still in preview and may change before final release. The extension contains some other commands, as I wrote it for training purposes and decided to leave it "as is", so if you are interested in writing your own extension you can get some ideas from it. The command for Time Travel Trace (TTD) is "!idnauls". The idea is to list all occasions when SharePoint logs something and show the time, event id (tag), TTD position, category, severity, and message of the log entries. It can also filter by tag, category, and message. The logic behind this is explained in a previous post. This command only works with TTD dumps (.run) and requires the new debugging tool, which is also in preview. There is no need to use this extension for any end other than learning; NetExt includes a similar command (!widnauls) for this purpose. Special thanks to Ken Sykes and Bill Messmer for the help with the new API.

Download project here.

The command to enumerate all calls to SharePoint log

Knowing that most ULS logging is done by calling one of these two exported functions, "onetnative!ULSSendFormattedTrace" and "Microsoft_Office_Server_Native!ULSSendFormattedTrace", and that tag ai108 translates to 0x21b6a2 per my previous post, this command does the trick:

dx /g @$cursession.TTD.Calls("onetnative!ULSSendFormattedTrace", "Microsoft_Office_Server_Native!ULSSendFormattedTrace").Where(c => c.Parameters[0] == 0x21B6A2).Select(c=> new { param1 = c.Parameters[0], param2 = c.Parameters[1], param3 = c.Parameters[2], param4 = c.Parameters[3], start = c.TimeStart, sequence = c.TimeStart.Sequence, steps = c.TimeStart.Steps } )

Parameters

  • Parameters[0] – Contains the tag in numeric form and is used to filter
  • Parameters[1] – Contains the id of the product/category. This information is in the SPLogLevel object, and I used this snippet to get the code id and description:

Add-PSSnapin Microsoft.SharePoint.Powershell -ErrorAction SilentlyContinue

$logLevels = (Get-SPLogLevel | Sort Id)
$lines = New-Object 'System.Collections.Generic.List[string]';

foreach($level in $logLevels)
{
    $lines.Add("`tcatMap[0x$($level.Id.ToString(""x""))]=""$($level.Area)|$($level.Name.Replace(""|"", "":""))"";");
    Write-Host "`tcatMap[0x$($level.Id.ToString(""x""))]=""$($level.Area)|$($level.Name.Replace(""|"", "":""))"";";
}

$lines | Out-File spcat.txt

  • Parameters[2] – Contains the severity, which is defined here.
  • Parameters[3] – Contains the pointer to the message string. There is a caveat here: a TTD string is only resolved when the context moves to the time of occurrence (see the sketch below).
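Because of that caveat, the extension always seeks to the call's position first and only then reads the string. Below is a condensed sketch of that pattern; the helper name ReadMessageAt is mine, while MoveTo is the seek helper shown in the full extension code further down:

// Seek the trace to the call's TimeStart object, then read the wide message
// string (Parameters[3]) from target memory. Condensed from the full command
// below; error handling and buffer sizing are simplified.
HRESULT ReadMessageAt(IModelObject *spStart, IDebugHost *pHost, ULONG64 messagePtr, std::string &text)
{
    IfFailedReturn(MoveTo(spStart)); // the context must be at the time of occurrence

    CComPtr<IDebugHostMemory> memory;
    IfFailedReturn(pHost->QueryInterface(__uuidof(IDebugHostMemory), (void**)&memory));

    Location loc;
    loc.HostDefined = 0;
    loc.Offset = messagePtr; // value of Parameters[3] at that position

    WCHAR buffer[1024] = {};
    ULONG64 bytesRead = 0;
    IfFailedReturn(memory->ReadBytes(USE_CURRENT_HOST_CONTEXT, loc, buffer, sizeof(buffer) - sizeof(WCHAR), &bytesRead));

    buffer[_countof(buffer) - 1] = L'\0';
    text = CW2A(buffer); // ULS messages are stored as wide strings
    return S_OK;
}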

Logic in a nutshell

  • Run the command to list all times when one of the two functions is called.
  • Go to each position and retrieve the message since it can only be read when the context is moved to the position.
  • Calculate the current time (at target machine).
  • Transform the id of product/category into string using a simple table (this does not require moving the context)
  • Transform the severity id into string (also no context change)
  • Print the result with the TTD position

Extension command code

#define IfFailedReturn(x) if(FAILED(x)) return E_FAIL;

void DisplayFound(std::string &FullMessage, std::string &Part)
{
    if (Part.size() == 0)
    {
        g_ExtInstancePtr->Out("%s", FullMessage.c_str());
        return;
    }

    size_t i = 0;
    size_t p = 0;

    while (i + Part.size() <= FullMessage.size())
    {
        p = FullMessage.find(Part, i);
        if (p == std::string::npos)
        {
            break;
        }
        g_ExtInstancePtr->Out("%s", FullMessage.substr(i, p - i).c_str());
        g_ExtInstancePtr->Dml("<col fg=\"wbg\" bg=\"srccmnt\">%s</col>", Part.c_str());
        i = p + Part.size();
    }

    // Print whatever remains after the last highlighted match.
    if (i < FullMessage.size())
    {
        g_ExtInstancePtr->Out("%s", FullMessage.substr(i).c_str());
    }
}

HRESULT MoveTo(IModelObject *spStart)
{
    //
    // SeekTo is a key on the object just like anything else. The value of the key is a method.
    //
    CComPtr<IModelObject> spSeekToMethod;
    IfFailedReturn(spStart->GetKey(L"SeekTo", &spSeekToMethod, nullptr));

    //
    // Before we arbitrarily go about using it as a method, do some basic validation.
    //
    ModelObjectKind mk;
    IfFailedReturn(spSeekToMethod->GetKind(&mk));
    if (mk != ObjectMethod)
    {
        return E_FAIL;
    }

    //
    // ObjectMethod indicates that it is an IModelMethod packed into punkVal. You can QI to be extra
    // safe if desired.
    //
    VARIANT vtMethod;
    IfFailedReturn(spSeekToMethod->GetIntrinsicValue(&vtMethod));
    //ASSERT(vtMethod.vt == VT_UNKNOWN); // guaranteed by ObjectMethod
    CComPtr<IModelMethod> spMethod; // or whatever mechanism you want to guarantee the variant gets cleared: variant_ptr, ...
    spMethod.Attach(static_cast<IModelMethod *>(vtMethod.punkVal));

    //
    // Call the method (passing no arguments). The result here is likely to be ObjectNoValue (there is no return value).
    //
    CComPtr<IModelObject> spCallResult;
    IfFailedReturn(spMethod->Call(spStart, 0, nullptr, &spCallResult, nullptr));

    return S_OK;
}

EXT_COMMAND(idnauls,
    "Command to list ULS position and tag and can be filtered by message or category",
    "{nomessage;b,o;;Do not show the ULS log message (faster processing).}"
    "{tag;s,r;;Tag to search for (e.g.: -tag b4ly). Use * for all tags. Required}"
    "{category;b,o;;Search text in Category or Product and not in message (e.g. -category Claims). Faster processing. Severity not searched}"
    "{message;b,o;;Search text in message and not in category or product (e.g. -message disk is full). Slower processing.}"
    "{;x,o;;Optional filter for message (-message) or category (-category) (e.g.: -message disk is full). Must be the last parameter}")
{
    wasInterrupted = false;
    UINT64 startTime, endTime;

    std::string tag = GetArgStr("tag");
    std::string mess;
    if (HasUnnamedArg(0))
        mess = GetUnnamedArgStr(0);
    bool nomess = HasArg("nomessage");
    bool message = HasArg("message");
    bool catonly = HasArg("category");

    if (catonly && message)
    {
        Out("Error: You have to use either -category or -message. Never both\n\n");
        Out("No search was performed\n");
        return;
    }

    if (message && tag == "*")
    {
        Out("Warning: When you combine -tag * and -message, it creates a very inefficient query.\n");
        Out(" -tag * will retrieve all ULS log entries and -message will require a move to an iDNA position every time.\n");
        Out(" -tag <tag> will only retrieve the ULS logs with this tag and then move to the position to retrieve the message.\n");
        Out("Information: Notice that -tag * and -category is ok and still very fast as category is also filtered without moving to the position.\n\n");
    }

    if ((catonly || message) && mess.size() == 0)
    {
        Out("Error: -category or -message require a filter pattern\n");
        Out("Example: !idnauls -tag ag9cq -message User was authenticated\n");
        Dml("3dbba3:1254    SharePoint Foundation    Claims Authentication    High    ag9cq\t<b>User was authenticated</b>. Checking permissions.\n\n");
        Out("Example: !idnauls -tag * -category Web Content Management\n");
        Dml("3de9ae:14c0    <b>Web Content Management</b>    Publishing Cache    High    ag0ld\n");
        Out("No search was performed\n");
        return;
    }

    map<int, int> catVector;
    if (catonly)
    {
        catVector = SPCategories::GetListAreaName(mess);
        if (catVector.size() == 0)
        {
            Out("No category/product contains '%s'\n", mess.c_str());
            Out("No search was performed\n");
            return;
        }
        Out("Warning: The string '%s' will only be searched on Category/Product, not in message\n", mess.c_str());
        mess.clear();
    }

    if (tag != "*" && (tag.size() < 4 || tag.size() > 5))
    {
        Out("Tag: '%s' is invalid\n", tag.c_str());
        Out("It can be either '*' for all or be between 4 and 5 bytes\n");
        return;
    }

    unsigned int tagBin = StrToTag(tag);
    if (tagBin == 0 && tag != "*")
    {
        Out("Tag: '%s' is invalid\n", tag.c_str());
        Out("It does not contain a valid tag sequence\n");
        return;
    }

    if (tag == "*")
    {
        swprintf_s(Buffer, MAX_MTNAME, L"@$cursession.TTD.Calls(\"onetnative!ULSSendFormattedTrace\", \"Microsoft_Office_Server_Native!ULSSendFormattedTrace\").Select(c=> new { param1 = c.Parameters[0], param2 = c.Parameters[1], param3 = c.Parameters[2], param4 = c.Parameters[3], start = c.TimeStart, sequence = c.TimeStart.Sequence, steps = c.TimeStart.Steps } )");
    }
    else
    {
        swprintf_s(Buffer, MAX_MTNAME, L"@$cursession.TTD.Calls(\"onetnative!ULSSendFormattedTrace\", \"Microsoft_Office_Server_Native!ULSSendFormattedTrace\").Where(c => c.Parameters[0] == 0x%p).Select(c=> new { param1 = c.Parameters[0], param2 = c.Parameters[1], param3 = c.Parameters[2], param4 = c.Parameters[3], start = c.TimeStart, sequence = c.TimeStart.Sequence, steps = c.TimeStart.Steps } )", tagBin);
    }

    std::wstring query(Buffer);

#if _DEBUG
    Out("%s = ", tag.c_str());
    Out("%x\n", tagBin);
    Out("dx %S\n", query.c_str());
#endif

    CComPtr<IHostDataModelAccess> client;
    HRESULT Status;
    REQ_IF(IHostDataModelAccess, client);

    CComPtr<IDebugHost> pHost;
    CComPtr<IDataModelManager> pManager;
    if (FAILED(client->GetDataModel(&pManager, &pHost)))
    {
        Out("Data Model could not be acquired\n");
        return;
    }

    CComPtr<IDebugHostEvaluator2> hostEval;
    CComPtr<IModelObject> spObject;
    pHost->QueryInterface(IID_PPV_ARGS(&hostEval));
    startTime = GetTickCount64();
    if (!SUCCEEDED(hostEval->EvaluateExtendedExpression(USE_CURRENT_HOST_CONTEXT, query.c_str(), nullptr, &spObject, nullptr)))
    {
        Out("Expression could not be evaluated\n");
        return;
    }

    CComPtr<IModelObject> pListOfBreaks;
    CComPtr<IIterableConcept> spIterable;
    if (SUCCEEDED(spObject->GetConcept(__uuidof(IIterableConcept), (IUnknown**)&spIterable, nullptr)))
    {
        CComPtr<IModelIterator> spIterator;
        if (SUCCEEDED(spIterable->GetIterator(spObject, &spIterator)))
        {
            //
            // We have an iterator. Error codes have semantic meaning here. E_BOUNDS indicates the end of iteration. E_ABORT indicates that
            // the debugger host or application is trying to abort whatever operation is occurring. Anything else indicates
            // some other error (e.g.: memory read failure) where the iterator MIGHT still produce values.
            //
            std::vector<UlsInstance> queryResult; // It will store the list of parameters
            UINT32 Index = 0;

            for (;;)
            {
                CComPtr<IModelObject> pBreakItem;
                CComPtr<IKeyStore> spContainedMetadata;
                HRESULT hr = spIterator->GetNext(&pBreakItem, 0, nullptr, &spContainedMetadata);
                if (hr == E_BOUNDS || hr == E_ABORT)
                {
                    break;
                }
                if (FAILED(hr))
                {
                    Out("There was a failure at an Item\n");
                    continue;
                    //
                    // Decide how to deal with failure to fetch an element. Note that pBreakItem *MAY* contain an error object
                    // which has detailed information about why the failure occurred (e.g.: failure to read memory at address X).
                    //
                }

                //
                // Read the values
                //
                CComPtr<IModelObject> tag;
                CComPtr<IModelObject> SevLevel;
                CComPtr<IModelObject> Category;
                CComPtr<IModelObject> Message;
                CComPtr<IModelObject> Start;
                CComPtr<IModelObject> Sequence;
                CComPtr<IModelObject> Steps;
                VARIANT vt_tag, vt_sevlevel, vt_category, vt_message, vt_sequence, vt_steps;

                if (FAILED(hr = pBreakItem->GetKeyValue(L"param1", &tag, NULL /* &spContainedMetadata */))) { Out("Error reading param1"); continue; }
                hr = tag->GetIntrinsicValue(&vt_tag);
                //hr = tag->GetIntrinsicValueAs(VT_INT_PTR, &vt_tag);
                if (FAILED(hr = pBreakItem->GetKeyValue(L"param2", &Category, NULL /* &spContainedMetadata */))) { Out("Error reading param2"); continue; }
                hr = Category->GetIntrinsicValue(&vt_category);
                if (FAILED(hr = pBreakItem->GetKeyValue(L"param3", &SevLevel, NULL /* &spContainedMetadata */))) { Out("Error reading param3"); continue; }
                hr = SevLevel->GetIntrinsicValue(&vt_sevlevel);
                if (FAILED(hr = pBreakItem->GetKeyValue(L"param4", &Message, NULL /* &spContainedMetadata */))) { Out("Error reading param4"); continue; }
                hr = Message->GetIntrinsicValue(&vt_message);
                if (FAILED(hr = pBreakItem->GetKeyValue(L"start", &Start, NULL /* &spContainedMetadata */))) { Out("Error reading start"); continue; }
                //hr = Start->GetIntrinsicValue(&vt_start); // It fails here because the type is ObjectSynthetic
                if (FAILED(hr = pBreakItem->GetKeyValue(L"sequence", &Sequence, NULL /* &spContainedMetadata */))) { Out("Error reading sequence"); continue; }
                hr = Sequence->GetIntrinsicValue(&vt_sequence);
                if (FAILED(hr = pBreakItem->GetKeyValue(L"steps", &Steps, NULL /* &spContainedMetadata */))) { Out("Error reading steps"); continue; }
                hr = Steps->GetIntrinsicValue(&vt_steps);

                if (IsInterrupted())
                {
                    break;
                }

                UlsInstance obj;
                obj.Category = vt_category.uintVal;
                obj.Message = vt_message.llVal;
                obj.SevLevel = vt_sevlevel.uintVal;
                obj.Sequence = vt_sequence.uintVal;
                obj.Steps = vt_steps.llVal;
                obj.tag = vt_tag.uintVal;
                obj.Index = Index;

                if (catVector.size() > 0)
                {
                    if (catVector.find((int)obj.Category) == catVector.end())
                    {
                        continue;
                    }
                }

                // Only move if necessary
                bool show = true;
                string fullMess;

                if ((mess.size() > 0 || !nomess) && SUCCEEDED(MoveTo(Start)))
                {
                    CComBSTR stringConv;
                    CComPtr<IDebugHostContext> context;
                    CComPtr<IDebugHostMemory> memory;
                    if (SUCCEEDED(hr = pHost->QueryInterface(__uuidof(IDebugHostMemory), (void**)&memory)))
                    {
                        ULONG64 BytesRead = 0;
                        Location loc;
                        loc.HostDefined = 0;
                        loc.Offset = obj.Message;
                        if (SUCCEEDED(hr = memory->ReadBytes(USE_CURRENT_HOST_CONTEXT, loc, Buffer, MAX_MTNAME * 2, &BytesRead)))
                        {
                            Buffer[MAX_MTNAME - 1] = L'\0';
                            fullMess = CW2A(Buffer);
                            if (mess.size() > 0)
                            {
                                show = fullMess.find(mess) != std::string::npos;
                            }
                        }
                    }
                    if (hr != S_OK)
                    {
                        fullMess = "*** Unable to read memory ***";
                        show = true;
                    }
                }

                if (show)
                {
                    Index++;
                    Dml("<link cmd=\"!tt %S\">%S</link>\t", obj.IDnaPosition().c_str(), obj.IDnaPosition().c_str());
                    string area;
                    string prod;
                    string sev = SPCategories::GetSevLevel(static_cast<int>(obj.SevLevel));
                    SPCategories::GetAreaName(obj.Category, area, prod);
                    if (!nomess)
                    {
                        SYSTEMTIME time;
                        if (!GetTime(time, true))
                        {
                            Out("??/??/???? ??:??:??.??\t");
                        }
                        else
                        {
                            Out("%02i/%02i/%04i %02i:%02i:%02i.%02i\t", time.wMonth, time.wDay, time.wYear, time.wHour,
                                time.wMinute, time.wSecond, time.wMilliseconds / 10);
                        }
                    }
                    Out("%s\t", area.c_str());
                    Out("%s\t", prod.c_str());
                    Out("%s\t", sev.c_str());
                    Out("%s\t", TagToStr(obj.tag).c_str());

                    if (!nomess)
                    {
                        if (mess.size() > 0 && !catonly)
                            DisplayFound(fullMess, mess);
                        else
                            Out("%s", fullMess.c_str());
                    }
                    queryResult.push_back(obj);
                    Out("\n");
                }
            }

            Out("%u Instances\n", Index);
            endTime = GetTickCount64();
            Out("Search took %f seconds\n", (float)(endTime - startTime) / 1000.0f);
        }
    }
}

Example 1 – Looking for a particular tag and message

Example 2 – Listing all instances

Power BI Tricks, Tips and Tools from the owners of PowerBI.Tips Mike Carlo and Seth Bauer


In this very special webinar, the owners of PowerBI.Tips and Power BI MVPs Seth Bauer and Mike Carlo will share their huge grab bag of Power BI tricks, tips, and tools published to http://PowerBI.Tips over the last 18 months.
Demos include their theme generator, adding data types within the query editor, and their latest offering, Power BI layouts (with a tour of their latest layout, “Cool Blue”).

When: 3/28/2018 10AM PST

Where: https://www.youtube.com/watch?v=fnj1_e3HXow

Experiencing Data Access Issue in Azure and OMS portal for Azure Log Analytics – Fairfax – 03/27 – Investigating

Initial Update: Tuesday, 27 March 2018 23:54 UTC

We are aware of issues with NPM data in the OMS portal and the Fairfax Azure portal for Azure Log Analytics and are actively investigating. All customers may experience issues while accessing NPM data in the OMS portal and Azure portal.

The following data types are affected: NPM data in the OMS portal and Azure portal.
Workaround: None

  • Next Update: Before 03/28 03:00 UTC

We are working hard to resolve this issue and apologize for any inconvenience.
-Rahim

Experiencing Data Access Issue in Azure and OMS portal for Azure Log Analytics – 03/28 – Resolved

Final Update: Wednesday, 28 March 2018 01:22 UTC

We've confirmed that all systems are back to normal with no customer impact as of 01:22 UTC. Our logs show the incident started on 03/27/2018 13:59 PDT and that during the 3 hours 33 minutes it took to resolve the issue, all customers in the Fairfax region experienced the impact, with no recent NPM data being reflected in the NPM dashboard.

Root Cause: The issue was caused by a certificate mismatch between the NPM service and the InMem service. During a recent deployment of InMem the cert was rotated to a new certificate, and we then started seeing exceptions at ingestion.

Lessons Learned: Any cert rotations should be communicated to OMS partners, and there should be a backup cert that kicks in if the first one fails. We will be investigating a proper resolution between the two teams (NPM and InMem), covering a cert rotation procedure and other possible ways of sharing storage blob information.

Incident Timeline: 3 hours & 1 minute

We understand that customers rely on Network Performance Monitoring as a critical service and apologize for any impact this incident has caused.

-Rahim



Microsoft Imagine Cup 2018 – Regional Final Schedule


Dear Imagine Cup Participants

 

Thank you for your submission. We have come up with a presentation schedule for teams who are interested in going to one of our 4 onsite locations to present their projects, in 15 min slots. Below you will find this schedule for every team that has registered.

 

Please, if you find your team name mentioned in a region that is too far for you (for example, you are from Islamabad or Lahore and the region assigned to you is Karachi), get in touch with Muhammad Sohaib (v-musoha@microsoft.com / 0343-3555716) ASAP so that you can be assigned a new location.

 

Regional contact names, host locations, and their contact details are given below against each region.

 

In case of any general queries or concerns, please reach out immediately to Muhammad Sohaib at the contact details provided. For region-specific queries, please reach out to the regional contacts given below. Please note, these timings are for ONSITE presentations. If a team is not able to come for an onsite presentation, their online submitted video and other deliverables will be judged and marked accordingly.

Note: If you don't see your team listed below, that means your submission was incomplete. Please get in touch with Sohaib; we are looking at how we can accommodate such teams at this point in time. When you reach out, MAKE SURE YOU LET US KNOW WHICH REGION / UNIVERSITY you belong to.

Contact Names & Host Locations:

 

Karachi:

Higher Education Regional Center Karachi – Muhammad Sohaib: 0343-3555716 / v-musoha@microsoft.com

 

Lahore:

Higher Education Regional Center Lahore – Sheikh Rizwan: 0312-5166755 / v-shrizw@microsoft.com

 

Peshawar:

Pearl Continental Hotel, Peshawar – Sheikh Rizwan: 0312-5166755 / v-shrizw@microsoft.com

 

Quetta:

Serena Hotel, Quetta – Muhammad Sohaib: 0343-3555716 / v-musoha@microsoft.com

 

Team List & Schedule:

 

Karachi:

Team Name | Regional Final Region | Timings | Date
3 Coders | Karachi | 10:15am | 29th March
Ali Ahmed | Karachi | 10:30am | 29th March
BEAMS | Karachi | 10:45am | 29th March
E-Henna | Karachi | 11:00am | 29th March
Faaiz-ul-Hassan | Karachi | 11:15am | 29th March
Factotum | Karachi | 11:30am | 29th March
Hammad ur Rehman, Sohaib Nadeem, Nazneen Kausar | Karachi | 11:45am | 29th March
imaginecodeR | Karachi | 12:00pm | 29th March
IoT Solutions | Karachi | 12:15pm | 29th March
ISU Robotics | Karachi | 12:30pm | 29th March
LimeLite | Karachi | 12:45pm | 29th March
MAAS | Karachi | 1:00pm | 29th March
Mars games | Karachi | 1:15pm | 29th March
ProRecruit | Karachi | 2:30pm | 29th March
Psycaria | Karachi | 2:45pm | 29th March
Reigning Tech (Order Booking System For Masses) | Karachi | 3:00pm | 29th March
Softaych | Karachi | 3:15pm | 29th March
Tabhouse | Karachi | 3:30pm | 29th March
Team atmotech | Karachi | 3:45pm | 29th March
TechyTeam | Karachi | 10:15am | 30th March
charming | Karachi | 10:30am | 30th March
DSU Game Developers | Karachi | 10:45am | 30th March
Team-HYPER-REC | Karachi | 11:00am | 30th March
Code Clone Finders | Karachi | 11:15am | 30th March
Mern Blazers | Karachi | 11:30am | 30th March
Team WSP | Karachi | 11:45am | 30th March
TechUnion | Karachi | 12:00pm | 30th March
Brain Busters | Karachi | 12:15pm | 30th March
Devil's Whisper | Karachi | 12:30pm | 30th March
Logical Processors | Karachi | 2:00pm | 30th March
NeuroSquad | Karachi | 2:15pm | 30th March
Pak Agile | Karachi | 2:30pm | 30th March
Team Ninja | Karachi | 2:45pm | 30th March
TechTor | Karachi | 3:00pm | 30th March
Fast track | Karachi | 3:15pm | 30th March
Humachines | Karachi | 3:30pm | 30th March

 

Lahore:

Team Name | Regional Final Region | Timings | Date
HNS | Lahore | 11:00am | 29th March
Syed Chaos | Lahore | 11:15am | 29th March
The MRI | Lahore | 11:30am | 29th March
Mehar's... | Lahore | 11:45am | 29th March
Queen Bees | Lahore | 12:00pm | 29th March
Star Girls | Lahore | 12:15pm | 29th March
The G Power | Lahore | 12:30pm | 29th March
Camelion | Lahore | 12:45pm | 29th March
Cyber Bullies | Lahore | 1:00pm | 29th March
cybertise | Lahore | 1:15pm | 29th March
Game Developers | Lahore | 2:30pm | 29th March
Garrisonian | Lahore | 2:45pm | 29th March
H & N | Lahore | 3:00pm | 29th March
hotel LGU | Lahore | 3:15pm | 29th March
LGU | Lahore | 3:30pm | 29th March
Lions Heart | Lahore | 3:45pm | 29th March
Power Puff | Lahore | 4:00pm | 29th March
SH | Lahore | 11:00am | 30th March
Spectrum Finders | Lahore | 11:15am | 30th March
Team Farhan | Lahore | 11:30am | 30th March
Team Girls | Lahore | 11:45am | 30th March
TMK | Lahore | 12:00pm | 30th March
Team MNS-UAM | Lahore | 12:15pm | 30th March
Team zee | Lahore | 12:30pm | 30th March
SPARTANS_NTU | Lahore | 2:00pm | 30th March
Tech ninjas | Lahore | 2:15pm | 30th March
The Amigos | Lahore | 2:30pm | 30th March
Gameplay2050 | Lahore | 2:45pm | 30th March
CESTINO - Smart Waste Managment | Lahore | 3:00pm | 30th March
Chaser Express | Lahore | 3:15pm | 30th March
IT Bugs | Lahore | 3:30pm | 30th March
ITI Solutions | Lahore | 3:45pm | 30th March

 

Peshawar:

Team Name | Regional Final Region | Timings | Date
Addonexus | Peshawar | 11:00am | 2nd April
XTECH 10 | Peshawar | 11:15am | 2nd April
Code 4 life | Peshawar | 11:30am | 2nd April
Cusit Data Warriors | Peshawar | 11:45am | 2nd April
IntrecX | Peshawar | 12:00pm | 2nd April
ALI AZIZ | Peshawar | 12:15pm | 2nd April
Civil Rocks 3 | Peshawar | 12:30pm | 2nd April
Fe Amaan | Peshawar | 12:45pm | 2nd April
Lublin Pakistan | Peshawar | 1:00pm | 2nd April
SAINT-NUST | Peshawar | 11:00am | 3rd April
techwork | Peshawar | 11:15am | 3rd April
Usman nazir | Peshawar | 11:30am | 3rd April
CodeDetectives | Peshawar | 11:45am | 3rd April
alpha 10 | Peshawar | 12:00pm | 3rd April
Wec Snake team | Peshawar | 12:15pm | 3rd April

 

Quetta:

Team Name | Regional Final Region | Timings | Date
Abdullah Sabir | Quetta | 11:00am | 2nd April
BUITEMS Computer Engineers | Quetta | 11:15am | 2nd April
ChildBook | Quetta | 11:30am | 2nd April
computer engineer | Quetta | 11:45am | 2nd April
CONE | Quetta | 12:00pm | 2nd April
crime maculation team | Quetta | 12:15pm | 2nd April
Genymotion | Quetta | 12:30pm | 2nd April
Markhors | Quetta | 12:45pm | 2nd April
Nerd Herd | Quetta | 1:00pm | 2nd April
Project cars | Quetta | 11:00am | 3rd April
Team ASB | Quetta | 11:15am | 3rd April
Zalmis | Quetta | 11:30am | 3rd April
Tehreem shfiq, kinza ishfaq, laraib ali | Quetta | 11:45am | 3rd April

 


[Skype for Business for iOS/Android] – About duplicate display of history entries


Good evening. This is the Japan Skype for Business Support Team.

In the Skype for Business for iOS/Android mobile clients, there are cases where missed-call history and call history entries are displayed in duplicate. This happens because the history recorded locally when the client itself receives a call and the history retrieved via EWS from the notification mail delivered to the Exchange mailbox cannot be matched as the same call, so the two entries are not merged. Two environments/scenarios in which this occurs have been confirmed so far; in the current version, this is the behavior as implemented.

  • Case where missed-call history is saved by the PC client
    This occurs when you are signed in to both the PC client and the mobile client at the same time.
    When Exchange integration is enabled and Unified Messaging is disabled, the PC client saves the missed-call notification mail.
    The mobile client retrieves this missed-call mail via EWS and displays it separately from the locally recorded history.
  • Case where call history is saved by SSCH (Server Side Conversation History)
    SSCH is a feature added in Skype for Business Server 2015 and Skype for Business Online.
    The UCWA service that controls the mobile client delivers the call history information to the user's mailbox. (UCWA)
    The mobile client retrieves this history mail via EWS and displays it separately from the locally recorded history.

< Example of duplicated missed-call history >

・The entries are recorded almost simultaneously, so the timestamps match
・The same number is displayed in both entries

< Example of duplicated SSCH call history >

・SSCH delivery lags, so the timestamps differ by a few minutes
・Depending on the PSTN gateway configuration, the SIP domain is appended to the number

When Unified Messaging is enabled, the missed-call notification mail is delivered by Exchange Server / Exchange Online and is not retrieved via EWS, so no duplicate entries appear. Also, on-premises Skype for Business Server 2015 can be configured not to deliver SSCH, which serves as a countermeasure against the duplicate display. Note that SSCH cannot currently be disabled in Skype for Business Online.
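As an illustration of the on-premises countermeasure: SSCH delivery is controlled through the client policy. The following is a minimal sketch for the Skype for Business Server Management Shell; it assumes the Global policy applies to the affected users, so verify the policy scope in your environment before applying it.

# Disable Server Side Conversation History (on-premises Skype for Business Server 2015 only;
# SSCH cannot currently be disabled in Skype for Business Online)
Set-CsClientPolicy -Identity Global -EnableServerConversationHistory $false

# Confirm the new setting
Get-CsClientPolicy -Identity Global | Select-Object Identity, EnableServerConversationHistory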


Disclaimer:
The contents of this article (including attachments and linked material) are current as of the date of writing and are subject to change without notice.

Tuesday Featured Post: Hesitating? Don’t Be Afraid to Ask Questions


"He who asks a question remains a fool for five minutes. He who does not ask remains a fool forever."

Good Day All!

We are back with Tuesday’s Featured Post, where we discuss a forum post or thread from the MSDN or TechNet Forums and highlight the value it added.

Among the various interesting posts in the forums, my pick is About Unit Testing, asked by Sakura Data in the SQL Server forum. In this post, the Original Poster would like to perform unit testing on the SQL Server platform and wanted to know which tools are needed to get started.

What grabbed my attention is that there are a lot of people in the community who feel too shy to ask simple questions. They think that doing so will make them look bad in the community. I think this forum post is an inspiration for the silent majority in the community to raise their voice and break that myth.

The thread was answered by Visakh16, who answered the question gracefully and pointed to an article that describes unit testing inside a database project with a step-by-step explanation.

Sometimes community members lack the courage to ask a simple question. Remember, asking "dumb" questions allows you to develop courage, and courage is the ability to do something that scares you. Like most fears, the more we face them, the smaller they become.

"You never know the truth. You know 'a' truth."

This forum post shows that, whether a question is simple or not, we community members are always here to help each other find a solution.

Thank You

-Ninja Sabah

Use Microsoft Forms with your favourite and familiar apps


Microsoft Forms, a relatively new app within Office 365, has undergone rapid development and has become a firm favourite as both a classroom and an admin tool for educators using Office 365. Did you know that it's now even easier to use Forms with your colleagues and students seamlessly? Through integration with the Office family, Microsoft Forms can easily collect information from your favourite and familiar apps. Check out the information below from the Forms Team.



Forms for Excel

Forms for Excel, powered by Microsoft Forms, has replaced Excel Survey and builds a live data connection between Microsoft Forms and Excel. The responses you collect in your form will show up, in real time, in your Excel workbook.


 


Forms in Microsoft Teams

You can now access Microsoft Forms directly in Microsoft Teams. Set up a Forms tab to create a new form or insert an existing one, create notifications for your form via connector, or conduct a quick poll using Forms Bot.


 


Forms web part for SharePoint

SharePoint has been widely used to share ideas and collect feedback. You can now use a Microsoft Forms web part on your SharePoint pages to collect responses or show survey results right on your site.



Find your group forms in portal

The forms you have created in Microsoft Teams or SharePoint team sites belong to the O365 group. The Forms portal has a new feature, "Recent group forms", where you can quickly access the group forms you have used recently.



Integrating Microsoft Forms into PowerPoint (under development)

Microsoft Forms' new integration with Microsoft PowerPoint will allow a teacher to easily insert a quiz into a PowerPoint deck. Click the Forms icon in the PowerPoint ribbon, and the list of your forms will be shown in the task pane. You can select a pre-created form and embed it in the current slide. Students who view the presentation can fill in the form and submit it without leaving PowerPoint.


Forms integration in PowerPoint is currently being developed and will be available to desktop users of PowerPoint in a few months.

The content above has been repurposed from the Forms blog site; check it out here.


Interested in using Microsoft Forms and want to know how? Complete this course on the Microsoft Educator Community to get started.

Upgrade of SSRS from SQL 2008 R2 to SQL 2012


Yesterday, I encountered a weird scenario in which the SSRS component failed to upgrade from SQL 2008 R2 to SQL 2012, while the SQL database engine and all the other components upgraded successfully.

The SSRS component upgrade was failing with the following error:

TITLE: Microsoft SQL Server 2012 Setup
------------------------------

The following error has occurred:

A Secure Sockets Layer (SSL) certificate is not configured on the Web site.

 

------------------------------

On checking the summary logs for the upgrade, I found the following entries for the SSRS component:

Feature: Reporting Services - Native
Status: Failed: see logs for details
Reason for failure: An error occurred during the setup process of the feature.
Next Step: The upgrade process for SQL Server failed. Use the following information to resolve the error, and then repair your installation by using this command line: setup /action=repair /instancename=MSSQLSERVER
Component name: SQL Server Reporting Services
Component error code: 0x80131500
Error description: A Secure Sockets Layer (SSL) certificate is not configured on the Web site.

As suggested, I tried to repair the SQL Server instance, and even the repair failed with the following error:

TITLE: Microsoft SQL Server 2012 Setup
------------------------------

The following error has occurred:

The Report Server WMI provider cannot create the virtual directory. This error occurs when you call SetVirtualDirectory and the UrlString is already reserved. To continue, clear all URL reservations by calling RemoveURL and then try again.

 

------------------------------

This was a more explanatory error: it indicated that certain UrlStrings were already reserved before the SetVirtualDirectory function call occurred.

I therefore opened a command prompt with admin privileges and ran the command netsh http show urlacl to list the reserved URLs.

Among the reserved URLs, I removed all those related to Reports and ReportServer using the commands netsh http delete urlacl url=https://....../Reports/ and netsh http delete urlacl url=https://....../ReportServer/.

I then listed the remaining URLs with the same command, netsh http show urlacl, and it no longer showed any URLs related to Reports or ReportServer.
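For reference, here is a minimal sketch of the cleanup sequence. The host name myserver is hypothetical; substitute the exact URLs shown by netsh http show urlacl in your environment. Run the commands from an elevated command prompt or PowerShell session:

# List the current HTTP.sys URL reservations
netsh http show urlacl

# Remove the stale Reporting Services reservations (host name is hypothetical)
netsh http delete urlacl url=https://myserver:443/Reports/
netsh http delete urlacl url=https://myserver:443/ReportServer/

# Verify that no Reports or ReportServer reservations remain
netsh http show urlacl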

When we then tried to repair the SSRS component, the repair succeeded, and the component was eventually upgraded to the SQL Server 2012 build.

 

Hope this helps!! Happy Reporting!!

VSTS/TFS Continuous Deployment to App Service Environment (ASE) after Disabling TLS 1.0


If you are a regular reader of my blog, you will have noticed that I have been spending some time working with Azure App Service Environment (ASE). It is available in Azure Government and should be the technology of choice for Government Web Apps. In a previous blog post, I described how to do CI/CD with ASE, and I have also recommended that you disable TLS 1.0 for your ASE. If you have tried to do a Web App deployment from Visual Studio Team Services (VSTS) or Team Foundation Server (TFS) into an ASE after disabling TLS 1.0, you may have noticed that it fails. The problem is that MSDeploy (running on your build agent) tries to use TLS 1.0 to deploy your application, and the connection fails. In this blog, I will describe the problem so that you can recognize it, and I will show you how to fix it.

If you deploy an ASE into a virtual network along with a build agent, you can use that build agent from VSTS or TFS to deploy to a Web App in the ASE. The local build agent is needed since the ASE cannot be seen from the hosted build agents in VSTS. The configuration is described here, and it would look something like this:

The JumpBox in the diagram above is only needed to test the setup if you have no other VMs or on-premises machines with access to the virtual network (through VPN or ExpressRoute). If you try to use the agent to deploy without making any modifications to it, you will get an error that looks something like this:

 

 

The specific error text is repeated here:

2018-03-23T17:39:21.4813236Z [command]"C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync -source:package='C:\agent\_work\r1\a\dotnetcore-example-ASP.NET Core-CI\drops.zip' -dest:contentPath='ase-site',ComputerName='https://ase-site.scm.cloudynerd.us:443/msdeploy.axd?site=ase-site',UserName='$ase-site',Password='********',AuthType='Basic' -enableRule:AppOffline -enableRule:DoNotDeleteRule -userAgent:VSTS_94a19df8-3720-4cbc-8661-facce05aa290_release_1_1_1_1
2018-03-23T17:39:21.9926944Z Info: Using ID 'e97b4322-ee2a-4c6c-9777-04582963a0fe' for connections to the remote server.
2018-03-23T17:39:23.2433126Z ##[error]Failed to deploy web package to App Service.
2018-03-23T17:39:23.2435199Z ##[error]Error: Could not complete the request to remote agent URL 'https://ase-site.scm.cloudynerd.us/msdeploy.axd?site=ase-site'.
Error: The underlying connection was closed: An unexpected error occurred on a send.
Error: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.
Error: An existing connection was forcibly closed by the remote host
Error count: 1.

The problem is, as indicated above, that msdeploy.exe is trying to use TLS 1.0. You can fix that by forcing the .NET Framework used by msdeploy.exe to use the "Strong Crypto" option. To do this, create a file, e.g. called strong-crypto.reg, with the following content:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319]
"SchUseStrongCrypto"=dword:00000001

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v4.0.30319]
"SchUseStrongCrypto"=dword:00000001

Then right-click the file and choose "merge"; this modifies the registry accordingly. If you then repeat the deployment, it should complete successfully.
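If you prefer to script this step instead of merging a .reg file, the following PowerShell sketch (run from an elevated session) sets the same two values as the .reg file above; the key paths and value come from that file, and the rest is just scaffolding:

# Force the .NET Framework 4.x (64-bit and 32-bit hives) to use strong crypto
$keys = @(
    "HKLM:\SOFTWARE\Microsoft\.NETFramework\v4.0.30319",
    "HKLM:\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v4.0.30319"
)
foreach ($key in $keys) {
    # -Force creates the value or overwrites it if it already exists
    New-ItemProperty -Path $key -Name "SchUseStrongCrypto" -Value 1 -PropertyType DWord -Force | Out-Null
}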

I have published a template for DevOps with ASE, which includes a build agent template. Since the ASE deployed in this scenario has TLS 1.0 disabled, I have modified the configuration of the build agent so that the registry edits are made automatically. This is accomplished in the ConfigureASEBuildAgent.ps1 script, specifically these lines:

        Registry StrongCrypto1
        {
            Ensure      = "Present"
            Key         = "HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319"
            ValueName   = "SchUseStrongCrypto"
            ValueType   = "Dword"
            ValueData   = "00000001"
        }

        Registry StrongCrypto2
        {
            Ensure      = "Present"
            Key         = "HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v4.0.30319"
            ValueName   = "SchUseStrongCrypto"
            ValueType   = "Dword"
            ValueData   = "00000001"
        }

In conclusion, disabling TLS 1.0 on an ASE will cause automated deployments with msdeploy.exe to fail. The solution is to enforce the "Strong Crypto" option, which we can achieve with a couple of registry edits. Let me know if you have questions/comments/suggestions.

 
