
Visual Studio Code C/C++ extension March 2018 update


Today we are excited to announce the March 2018 update to the Visual Studio Code C/C++ extension! This update includes improved auto-complete for local and global scopes and a simplified configuration process for system includes and defines, enabling a better out-of-box IntelliSense experience. You can find the full list of changes in the release notes.

We’d like to thank everyone who tried our Insiders builds earlier this month and sent us feedback! Fixes for the issues you reported and feature suggestions you told us about helped shape the final release the way it is today 😊. If you are not yet an insider but are interested, we’d love for you to join the VS Code C/C++ Insiders program.

Auto-complete for local and global scopes

While this feature is not completely new, IntelliSense now provides a semantic-aware list of auto-complete suggestions when you type in local and global variables and functions. Compared with the previous approach, the new auto-complete experience gives you a shorter and more relevant list of suggestions, making it easier to write C/C++ code.

System includes and defines automatically retrieved from the compiler

IntelliSense now automatically retrieves system includes and defines from GCC/Clang-based compilers, eliminating the need for manual configuration in the “includePath” and “defines” settings. On Mac and Linux, the IntelliSense engine automatically selects a compiler as the default by searching for the installed compilers on the system. You can check which compiler is being used in the new “compilerPath” setting in the c_cpp_properties.json file, and change the value if needed. The “compilerPath” setting also accepts compiler arguments that affect the system defines returned.

In addition, the new “cStandard” and “cppStandard” settings allow setting language standards explicitly for IntelliSense.

Force IntelliSense to process arbitrary headers

If you wish IntelliSense to process headers that are not explicitly listed in #include statements, you can now use the new “forcedInclude” setting to specify them. The IntelliSense engine will first process these headers before it looks at #includes.
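For illustration, a c_cpp_properties.json fragment that uses these settings might look like the following (the compiler path and forced-include header below are placeholders for your own environment):

{
    "configurations": [
        {
            "name": "Linux",
            "compilerPath": "/usr/bin/gcc",
            "cStandard": "c11",
            "cppStandard": "c++17",
            "forcedInclude": [ "${workspaceFolder}/common/prefix.h" ]
        }
    ],
    "version": 4
}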

Tell us what you think

Download the C/C++ extension for Visual Studio Code, try it out, and let us know what you think. File issues and suggestions on GitHub. If you haven’t already provided us feedback, please take this quick survey to help shape this extension for your needs. You can also find us on Twitter (@VisualC).



Why a developer should build a solution with microservices


Hi all!
Good day!

Today I am continuing our blog post series on microservices from the perspective of a developer. This is the second post in the series; I suggest you read the first post, Introduction to microservices, before this one.

This series is written entirely from a developer's point of view, and I will cover both theory and practice from start to end. In this post I will discuss why, as developers, we should choose/adopt microservices.

Andreas Helland is running a separate series, Building Microservices with AKS and VSTS, that you may want to read as well.

Before an organization decides to go ahead with microservices, it must understand all the benefits as well as the flip side of the coin. Let's look at the benefits first. Of the attributes of microservices we discussed in the last article, the most important ones are:

  1. Isolated/Single Functionality.
  2. Isolated Data & State.

As you move ahead you will see how this is done. Meanwhile, let's see what advantages you derive from these, both from the perspective of the team involved in the actual development of the code and from that of the sales/strategy team.

Dev Team Advantages

Once isolated/single functionality has been achieved, your code comes in much smaller units that can be handled without inter-dependencies on other modules. This allows multiple teams to work on the existing code-base without having to rely on each other for:

  1. Knowledge Transfer
  2. Code/Components
  3. Deployment
  4. Technology Choice
  5. Scaling

This happens because, with the code being independent of other modules/parts of the application, we don't have to worry about breaking (or re-testing) other parts of the code with whatever changes/additions we make to our own code base. This is achieved not just with code separation or isolation, but by combining it with data isolation for your part of the code [the microservice]. If the data that we deal with in our code is directly accessible to other applications or modules, then we can never rely upon its state or availability.

We can work out the final setup of the microservices-based architecture so that we can make changes to our code, then test and deploy it without worrying about breaking the existing services in production. This is achieved by releasing another version of the microservice, so different versions of a microservice can co-exist without any problem. There can be other reasons for having different versions too: for example, the same functionality may need to be delivered to another consumer with slight modifications based on their business needs.

Business Advantages

When we design a product, we design it around aspects like stability, efficiency and features, based upon the business needs prevalent at the time the software is designed. However, there is always a chance that while we are still finishing our product, another company jumps into the picture and provides more features at the same price. Effectively, our product's business advantage might be reduced or lost altogether. After all, this is business, and nobody is required to be nice to you in a market with cut-throat competition.

So what is the strategy to not only minimize the losses but regain the foothold? This is where the business advantage of microservices comes into the picture. With this architecture it is easier to add new features without breaking existing functionality, and at much lower cost. Beyond cost, the business gains the capability to respond swiftly to changing needs with minimal time to market.

In such scenarios it can be a do-or-die situation for the business, and the dev team is the only savior it has at such times. This is one of the reasons we mentioned in the last post that microservices is a company-wide culture to be adopted, not just another practice.

I am not sure how it was for you, but in my first 15 years in the IT industry, sales and dev teams were usually never on the same page and always at loggerheads. For the sales team, the client opportunity was not to be lost at any cost, whereas the dev team cared more about not committing to deadlines for code that could not be delivered. This can change now.

Other Advantages

Simply speaking, in my opinion the advantages discussed above summarize it. However, as the life cycle of a product progresses, there are two areas where microservices provide even more benefit. Let's take a look at them.

Scalability

Once an enterprise-level product has been deployed and made available to the online community, then, subject to a few limiting factors, there is a good probability that the user base will increase. Even if the increase is modest rather than substantial, over the years the product may need to handle not just more concurrent users but also increased data volumes.

Microservices allow you to easily identify and monitor the areas that are under stress due to data handling or an increased user base, and then, with equal ease, scale up just those microservices where the stress is identified, without scaling up the rest of the system/application. In a monolithic architecture this is not a luxury we had; in those scenarios we would end up scaling the entire application - all modules.

That means effectively investing time and money in scaling up modules/parts that don't need it or won't benefit from it - a disadvantage in both cost and time.

The cost-benefit ratio of scaling up a single microservice is very impressive compared with scaling a monolithic application.

Technology Adoption

Even if you continue to work in the same language for your entire lifetime, new compiler and language specifications and features are always being released. There will be times when backward compatibility is broken, or when an entirely new technology disrupts the market - one that can make life easier for devs, speed up go-to-market for the business and deliver better performance for the work at hand.

In such scenarios you might want to evaluate the technology even if you don't eventually adopt it. However, with a monolithic architecture even evaluation can be prohibitive. You may want to take gradual steps in adopting the technology while the community around it is still building up, and take the plunge only once the technology is confirmed stable by the industry as a whole.

With microservices we can very easily single out one part of the whole ecosystem to be tested with the new technology and implement it without affecting the rest of the system. The best part is that while the new version of the service is still in beta, the system can be made to support both versions of the service - effectively protecting the system from breaking down in case our technology-adoption experiment goes wrong for any reason.

It cannot get easier than this when it comes to adopting a new technology and even testing it in production, because you can rest assured that the ecosystem will not be affected even in the worst-case scenario. And we could very well do this with more than one service at the same time.

Summary

In this blog post I tried to lay out the various advantages that microservices provide for developers and business sponsors alike, and to show how technology adoption becomes easier with them.

In the coming posts I will start showing how to actually take an existing monolithic application and convert it into a microservice-based architecture.

Using the first job run to optimize subsequent runs with Azure Data Lake job AU analyzer


Customers have been telling us it's hard to find the right balance between Analytics Units (AUs) and job execution time. For too many users, optimizing a job means running the job in the cloud using trial and error to find the best AU allocation that balances cost and performance. Today we are happy to introduce the AU Analysis feature to help you discover the best AU allocation strategy for a specific job. You can find this new feature by opening the job view and clicking on the AU Analysis tab in the Azure Portal or in the latest Visual Studio tools.

Find the best AU allocation for your job

A Data Lake job is billed by the number of AU-hours it consumes (the number of AUs specified multiplied by the job execution time). You may want to optimize for cost (reducing AU waste) or to achieve a certain SLA on response time (e.g. an hourly job should finish in about 30 minutes and must not take longer than 1 hour).
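For example, a job that runs for 2 hours with 50 AUs allocated is billed 50 × 2 = 100 AU-hours, regardless of how many of those AUs were actually busy at any given moment.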

How can you know how long it will take a job to run at different levels of AU allocation without running it many times in the cloud? The analyzer takes the execution information gathered while the job ran and simulates how different AU allocations might impact the run time of the job.

On the AU Analysis tab, you can see two AU allocation suggestions - 'Balanced' and 'Fast' - that are based on your job's particular characteristics (e.g. how parallelizable it is, how much data it processes, etc.). If you are looking for the most cost-effective use of AUs, where you get the maximum improvement in execution time for the least number of additional AU-hours, then 'Balanced' is the one you want. Remember that you may not always see a linear reduction in job execution time when increasing the number of AUs. On the other end of the spectrum, you may want to prioritize performance, in which case choose 'Fast' to see the corresponding AUs and job run time.

In the example job below, the job originally ran for 2.5 hours with 50 AUs allocated, and 127.1 AU-hours were used. If we choose the 'Balanced' option, you will see that the job can run 8 times faster than the original run (2.5 hours down to 18.7 minutes) even with a 9% reduction in AU-hours (127.1 to 115.2). In the chart you can also see the simulated AU usage over time for this option.

 

 

If you want to know the best performance with a reasonable efficiency, then the ‘Fast’ option will serve you well. It shows how many AUs are needed for the best performance. As you can see in the screenshot below, allocating 1765 AUs causes the job to finish in only 11.8 minutes.

 

 

If you want to achieve a certain SLA on response time or explore other AU allocations, the AU analyzer also lets you explore AU-hours versus job runtime by using the custom slider bar and dynamically see how the charts and numbers change for different settings.

In the screenshot below, by selecting the Custom card and moving the slider bar, we can see that you need to allocate 81 AUs to have the job finish within 1 hour (costing 83.4 AU-hours).

 

How accurate is the analyzer?

Basically, the analyzer uses a local job scheduler to simulate the actual execution. It closely mirrors the job scheduler running in the cluster, except that it does not take into account dynamic conditions (network latencies, hardware failures, etc.) that may cause vertices to rerun or slow down. To learn how those dynamic factors affect the accuracy of the analyzer, we tested it on several hundred production jobs and achieved correct estimates in better than 99% of the cases.

 

Try this feature today!

Give this feature a try and let us know your feedback in the comments.

Interested in any other samples, features, or improvements? Let us know and vote for them on our UserVoice page!

 

Learn more

For a more detailed introduction to AU usage and job cost, refer to the two blogs below.

Understanding the Data Lake Analytics Unit

How to Save Money and Control Costs with Azure Data Lake Analytics

A WPF application crashes with a StackOverflowException while interacting with a TreeView


Hello, this is the Platform SDK (Windows SDK) support team.
In this post we describe an issue where a WPF application terminates unexpectedly with a StackOverflowException while the user interacts with a TreeView.

 

Symptoms

In a WPF application that uses a TreeView, operations such as scrolling the TreeView with the scroll bar or clicking the TreeView's client area with the mouse can cause the process to terminate with a StackOverflowException.
This issue has been confirmed only on .NET Framework versions 4.7 and 4.7.1.
 

Cause

This issue is caused by a bug in the virtualization logic of the VirtualizingStackPanel used internally by the TreeView in .NET Framework 4.7 and 4.7.1.
 

Workarounds

Consider one of the following workarounds.

  • Workaround (1)
    Set the TreeView's VirtualizingStackPanel.IsVirtualizing property to False and use the TreeView with virtualization turned off (see the XAML sketch below).
  • Workaround (2)
    Set the IsVirtualizingStackPanel_45Compatible property to True via <appSettings>.
    With this change, the virtualization logic of the VirtualizingStackPanel inside the TreeView behaves as it did in .NET Framework 4.5.

<add key="IsVirtualizingStackPanel_45Compatible" value="true"/>
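For reference, minimal sketches of the two workarounds (the TreeView markup and the app.config layout below are illustrative):

<!-- Workaround (1): turn virtualization off on the TreeView -->
<TreeView ItemsSource="{Binding Items}"
          VirtualizingStackPanel.IsVirtualizing="False" />

<!-- Workaround (2): the appSettings entry above in its app.config context -->
<configuration>
  <appSettings>
    <add key="IsVirtualizingStackPanel_45Compatible" value="true"/>
  </appSettings>
</configuration>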

 

Note

If you adopt workaround (2), the virtualization logic of the VirtualizingStackPanel inside the TreeView runs in .NET Framework 4.5 compatibility mode.
In that case, the bug fixes and quality improvements made to the VirtualizingStackPanel in .NET Framework 4.6 and later no longer apply.
As a result, issues specific to .NET Framework 4.5, such as the following, may newly appear:

  • When the TreeView is scrolled quickly, the displayed TreeView items can become incorrect (e.g. items that should be displayed are not shown)

 

Status

Microsoft is investigating this issue.
We will update this blog as soon as there is further progress.

New to Office 365? Get trained on the MEC


If you're new to Office 365 then we have the perfect course on the Microsoft Educator Community for you and your staff! This quick and simple course is a fantastic introduction to Office 365 online and gives practical tips and tricks for the use of Office 365 on any device.

 

 

 


Streamline efficiency with Office 365 apps

Description:
Office 365 provides the right environment for better learning outcomes. In this introduction to Office 365, educators will learn how to become more innovative with cloud-based tools, regardless of the device they use. This course is aimed at educators for whom Office 365 is relatively new and who are looking to implement solutions to classroom problems right away.

 

 

 


 

Learning Objectives:

  • Learn how to create, share and sync a document on OneDrive for Business, and use other OneDrive & Office Online features.
  • Explore key features in Office 365
  • Discover how Office 365 apps can support student learning

 


Not already a member of our Microsoft Educator Community? Join now and find a whole host of FREE CPD courses which can help you navigate the latest and greatest in Windows 10 and Office 365 with ease. Earn just 1,000 points (the equivalent of two introductory courses) to be recognised as a Certified MIE for your achievements.

Managing an Azure subscription owned by a Microsoft Account from an Organizational Account (AAD)


You may, like me, find yourself in a situation where

  • you have an existing Azure subscription that belongs to a Microsoft Account,
  • you want to manage it with your Organizational Account (Azure AD),
  • and, for whatever reason, you do not want to or cannot transfer ownership of the subscription to the Organizational Account – e.g. when it is a sponsored subscription whose sponsorship is tied to a particular Microsoft Account (Microsoft Partner Network, MSDN Subscription, MVP Sponsorship, etc.).

The whole trick is simple: change the directory of the subscription to your Azure AD directory. This arrangement keeps ownership of the subscription with the Microsoft Account (Account Admin), yet you will still be able to manage the subscription in the portal when signed in with your Organizational Account.

In practice it takes just a few steps:

1. Add the Microsoft Account to your Azure AD as a guest user

To be able to change the subscription's directory, the Microsoft Account must be a member of the target AAD. Associating the MSA with the AAD is straightforward:

  • Sign in to the Azure Portal as an Azure AD administrator of the target AAD.
  • Open the Azure Active Directory blade.
  • Go to the Users section.
  • Click the "+ New guest user" button in the top toolbar.
  • Invite your Microsoft Account to the target Azure Active Directory.


2. Accept the Microsoft Account's invitation to the AAD

The invitation needs to be accepted…

  • You will receive an e-mail in the Microsoft Account's mailbox containing a button to accept the invitation.
  • It is better not to click the Accept Invitation button directly, because your browser may already be signed in with your Organizational Account (or may pick up who-knows-which account).
  • Instead, copy the button's target URL and open it in a private browser window (New incognito window, In-private, or whatever your browser calls it).
  • Sign in with the Microsoft Account.
  • After signing in and accepting the invitation you will most likely be redirected to a confusing (often empty) Applications page. Nevertheless, the association has been created and you can close the browser.

3. Change the subscription's directory

Now you can change the subscription's directory to the target AAD:

  • Sign in to the Azure Portal with your Microsoft Account.
  • Open the subscription in question (you can, for example, use the search box at the top and type "subscription").
  • Click the Change directory button in the top toolbar.
  • In the Change the directory panel you should now be able to select your Azure AD as the target directory.
  • Confirm the change.

The change usually takes effect within 10 minutes.


4. Grant permissions to your Organizational Account

To be able to use your Organizational Account to manage the subscription, you need to grant it the appropriate permissions (while still signed in with the original Microsoft Account).

  • In the Subscription blade, switch to the Access control (IAM) section.
  • Assign your Organizational Account the Owner role at the subscription level (a command-line sketch follows this list).
  • In the context menu (right mouse button) of the added Organizational Account you can also choose Add as co-administrator. A few older edge-case scenarios still rely on co-administrator permissions and the Owner role alone is not enough for them.
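If you prefer a command line over the portal for this step, a minimal PowerShell sketch using the AzureRM module of that era (the subscription ID and sign-in name below are placeholders):

# Sign in with the Microsoft Account that owns the subscription
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionId "00000000-0000-0000-0000-000000000000"

# Grant the Organizational Account the Owner role at subscription scope
New-AzureRmRoleAssignment -SignInName "admin@contoso.com" -RoleDefinitionName "Owner" `
    -Scope "/subscriptions/00000000-0000-0000-0000-000000000000"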


5. Done

  • Sign out of the Microsoft Account, sign in with your Organizational Account, and you should be able to manage the subscription in the portal.

 

Robert Haken, HAVIT Knowledge Place

Skype for Business Server 2015: the March 2018 update has been released


Hello, this is the Japan Lync/Skype support team.

The March 2018 update for Skype for Business Server 2015 has been released.

Updates for Skype for Business Server 2015
https://support.microsoft.com/en-us/kb/3061064

It contains many fixes and improvements, so please apply it and enjoy a pleasant Lync/Skype experience.

The information in this article (including attachments and linked content) is current as of the date of writing and may change without notice.


Nullable types arithmetic and null-coalescing operator precedence


Here is a simple question for you: which version of GetHashCode() is correct, and what performance impact would the incorrect version have?

public struct Struct1
{
    public int N { get; }
    public string S { get; }
    public Struct1(int n, string s = null) { N = n; S = s; }

    public override int GetHashCode() =>
        N ^
        S?.GetHashCode() ?? 0;

    public override bool Equals(object obj) =>
        obj is Struct1 other && N == other.N && string.Equals(S, other.S);
}

public struct Struct2
{
    public int N { get; }
    public string S { get; }
    public Struct2(int n, string s = null) { N = n; S = s; }

    public override int GetHashCode() =>
        S?.GetHashCode() ?? 0 ^
        N;

    public override bool Equals(object obj) =>
        obj is Struct2 other && N == other.N && string.Equals(S, other.S);
}

The structs are not perfect (they don't implement IEquatable<T>) but this is not the point. The only difference between the two is the GetHashCode() implementation:

// Struct 1
public override int GetHashCode() =>
    N ^
    S?.GetHashCode() ?? 0;

// Struct 2
public override int GetHashCode() =>
    S?.GetHashCode() ?? 0 ^
    N;

Let's check the behavior using the following simple benchmark:

private const int count = 10000;
private static Struct1[] _arrayStruct1 =
    Enumerable.Range(1, count).Select(n => new Struct1(n)).ToArray();
private static Struct2[] _arrayStruct2 =
    Enumerable.Range(1, count).Select(n => new Struct2(n)).ToArray();

[Benchmark]
public int Struct1() => new HashSet<Struct1>(_arrayStruct1).Count;

[Benchmark]
public int Struct2() => new HashSet<Struct2>(_arrayStruct2).Count;

The results are:

Method  |         Mean |        Error |       StdDev |
--------|-------------:|-------------:|-------------:|
Struct1 | 736,298.4 us | 4,224.637 us | 3,745.030 us |
Struct2 |     353.8 us |     2.382 us |     1.989 us |

Wow! Struct2 is 2000 times faster! This definitely means that the second implementation is correct and the first one is not! Right? Actually, no.

Both implementations are incorrect, and it is just by accident that the second one "works better" in this particular case. Let's take a closer look at the GetHashCode method for Struct1:

public override int GetHashCode() => N ^ S?.GetHashCode() ?? 0;

You may think that this statement is equivalent to N ^ (S?.GetHashCode() ?? 0), but it is actually equivalent to (N ^ S?.GetHashCode()) ?? 0:

public override int GetHashCode()
{
    int? num = N ^ ((S != null) ? new int?(S.GetHashCode()) : null);

    if (num == null)
        return 0;

    return num.GetValueOrDefault();
}

Now it is way more obvious why Struct1 is so slow: when the S property is null (which is always the case in this example), the hash code is 0 regardless of N, because N ^ (int?)null is null. And trying to add 10000 values with the same hash code effectively turns the hash set into a linked list, drastically affecting performance.

But the second implementation is also wrong:

public override int GetHashCode() => S?.GetHashCode() ?? 0 ^ N;

Is equivalent to:

public override int GetHashCode()
{
    if (S == null)
    {
        return 0 ^ N;
    }

    return S.GetHashCode();
}

In this particular case, this implementation gives us a way better distribution, but only because S is always null. In other scenarios this hash function could be terrible and could give the same value for a large set of instances as well.

Conclusion

There are two reasons why the expression N ^ S?.GetHashCode() ?? 0 does not give us what we might expect. First, C# supports the notion of lifted operators, which allows mixing nullable and non-nullable values in one expression: 42 ^ (int?)null is null. Second, the precedence of the null-coalescing operator (??) is lower than the precedence of ^.

Operator precedence for some operators is so obvious that we can omit explicit parentheses around them. In the case of the null-coalescing operator the precedence can be tricky, so use parentheses to clarify your meaning.
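For completeness, a minimal corrected sketch (mine, not from the code above) that makes the intended grouping explicit:

// Parentheses make ?? apply only to the nullable hash of S,
// so N always contributes to the final hash code.
public override int GetHashCode() =>
    N ^ (S?.GetHashCode() ?? 0);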


apps


Some of these apps are also introduced in part on the following site, for your reference.

https://blogs.msdn.microsoft.com/shintak/2017/08/01/hololensapps2017h1/

 

Recommended HoloLens apps

 

HoloEngine: a working model of an engine.

https://www.microsoft.com/ja-jp/store/p/holoengine/9nblggh4wkh9

AirCraft Explorer: displays an aircraft roughly 3 m in size; you can view it freely together with others.

https://www.microsoft.com/ja-jp/store/p/aircraft-explorer/9nblggh4tnld

HoleLenz: an app that opens a hole in the wall so you can peek into another space.

https://www.microsoft.com/ja-jp/store/p/holelenz/9p7mr081n921

 

Orinox Model Review Holographic: recreates a factory.

https://www.microsoft.com/ja-jp/store/p/orinox-model-review-holographic/9n55t22tlbwp

Robotics BIW: recreates a factory; sharing is also supported.

https://www.microsoft.com/ja-jp/store/p/robotics-biw/9pjwkl0gcdx9

 

HoloHeart: a detailed model of the heart; it can also beat.

https://www.microsoft.com/ja-jp/store/p/holoheart/9nblggh4v0pz

HoloMaps by Taqtile: an app that can display 3D maps.

https://www.microsoft.com/ja-jp/store/p/holomaps-by-taqtile/9nblggh3zrtd

HoloPlanner: an app for simulating furniture placement.

https://www.microsoft.com/ja-jp/store/p/holoplanner/9nblggh4spf1

HoloTire: an app that introduces how a tire works.

https://www.microsoft.com/en-us/store/p/holotire/9nblggh53113

HoloAnatomy: an app that displays an anatomical model of the human body.

https://www.microsoft.com/en-us/store/p/holoanatomy/9nblggh4ntd3

 

The same apps can also be opened directly in the Microsoft Store via the following links:

HoloEngine: ms-windows-store://pdp/?ProductId=9nblggh4wkh9

AirCraft Explorer: ms-windows-store://pdp/?ProductId=9nblggh4tnld

HoleLenz: ms-windows-store://pdp/?ProductId=9p7mr081n921

Orinox Model Review Holographic: ms-windows-store://pdp/?ProductId=9n55t22tlbwp

Robotics BIW: ms-windows-store://pdp/?ProductId=9pjwkl0gcdx9

HoloHeart: ms-windows-store://pdp/?ProductId=9nblggh4v0pz

HoloMaps by Taqtile: ms-windows-store://pdp/?ProductId=9nblggh3zrtd

HoloPlanner: ms-windows-store://pdp/?ProductId=9nblggh4spf1

HoloTire: ms-windows-store://pdp/?ProductId=9nblggh53113

HoloAnatomy: ms-windows-store://pdp/?ProductId=9nblggh4ntd3

Top stories from the VSTS community – 2018.03.30


Here are the top stories we found in our streams this week related to DevOps, VSTS, TFS and other interesting topics.

Top Stories

  • Use Cognitive Services in VSTS Custom Branch Policies - Yan Sklyarenko
    Following on from the popular post about using Twitter sentiment as a release gate, Yan published an interesting post showing how to use the Azure Cognitive Services API in a release gate to check the language of a pull request. While the example in question might not be what you need in your org, the post itself is worth bookmarking as a great how-to on release gates including keeping API secrets safe in Azure Key Vault.
  • Building Microservices with AKS and VSTS - Andreas Helland
    Andreas has started a new series on deploying to the Azure Container Service & Kubernetes using VSTS. Promises to be an excellent series starting from the very beginning - how to get your code into VSTS.
  • Deploying WordPress Application using Visual Studio Team Services and Azure (Part Two) - Yaron Pri Gal
    Yaron follows up his post about deploying WordPress using VSTS. Not only is it an example of deploying WordPress sites, it's also worth a read as a detailed guide to deploying a LAMP-stack-based site (including the MySQL deployment).
  • #MyGitJourney - Mickey Gousset
    Git can be very intimidating to new users, but Mickey is doing a great job documenting his journey getting to grips with Git as a long-time TFVC user. Worth following along if you are also new to Git and want to learn alongside Mickey how to get going as a Git user.
  • Global DevOps Bootcamp
    A regular plug that the Global DevOps Bootcamp will return on June 16. Last year's event was fantastic and a great way to learn all about CI/CD with VSTS using a realistic set of scenarios. Places fill up fast, so make sure you register to attend your local event asap; if you don't have one nearby, why not host your own?

Podcasts & Videos

If you have feedback or would like to see a story included in next week's round-up, then please leave a comment below or use the #VSTS hashtag on Twitter.

Searches fail when System.ItemUrl is specified in a Windows Search SQL statement



Hello, this is the Platform SDK (Windows SDK) support team.
In this post we describe an issue where searching for files from an application via the Windows Search feature fails to return results because of an internal error.

Symptoms
On Windows 10 1703 (Creators Update) and Windows 10 1709 (Fall Creators Update), a search performed with the OS's Windows Search feature fails when you try to retrieve the file URL (e.g. file:///c:/mydir/bar/hello.txt) as part of the results.

For example, when an application specifies System.ItemUrl in the SELECT statement as shown below, hr = cCommand.Open(cSession, pszSQL); fails with a DB_E_ERRORSOCCURRED error.

#include <atldbcli.h>

CDataSource cDataSource;
HRESULT hr = cDataSource.OpenFromInitializationString(
    L"provider=Search.CollatorDSO.1;EXTENDED PROPERTIES=\"Application=Windows\"");

CString pszSQL = L"SELECT System.ItemUrl FROM MachineName.SystemIndex WHERE SCOPE='file://MachineName/<path>'";

if (SUCCEEDED(hr))
{
    CSession cSession;
    hr = cSession.Open(cDataSource);

    if (SUCCEEDED(hr))
    {
        CCommand<CDynamicAccessor, CRowset> cCommand;
        hr = cCommand.Open(cSession, pszSQL);

        if (SUCCEEDED(hr))
        {
            for (hr = cCommand.MoveFirst(); S_OK == hr; hr = cCommand.MoveNext())
            {
                for (DBORDINAL i = 1; i <= cCommand.GetColumnCount(); i++)
                {
                    PCWSTR pszName = cCommand.GetColumnName(i);
                    // do something with the column here
                }
            }
            cCommand.Close();
        }
    }
}


Cause
This problem occurs because the search fails with a DB_E_ERRORSOCCURRED error whenever System.ItemUrl is specified in the SELECT statement of a Windows Search SQL query. The error comes from an internal logic problem, and we have confirmed that this is a defect in the product.


Workaround
Because the search fails when System.ItemUrl is specified in the SELECT statement, do not specify this value. If you need the file path (e.g. c:/mydir/bar/hello.txt) in the search results, you can use System.ItemPathDisplay instead. If you need to handle the result path as a URL, you can work around the issue programmatically by prepending the file scheme to the System.ItemPathDisplay value yourself, as sketched below.
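For example, a minimal sketch of the workaround query (the machine name and scope path are placeholders):

CString pszSQL = L"SELECT System.ItemPathDisplay FROM MachineName.SystemIndex WHERE SCOPE='file://MachineName/<path>'";
// If a URL is needed, build it from the returned path yourself, e.g. by prepending L"file:///".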

Status

A fix for this issue is planned for the next version of Windows.


References

For an overview of the Windows Search feature and how to use SQL queries with it, see the following:

Windows Search

https://msdn.microsoft.com/en-us/library/windows/desktop/ff628790(v=vs.85).aspx

Querying the Index with Windows Search SQL Syntax

https://msdn.microsoft.com/en-us/library/windows/desktop/bb231256(v=vs.85).aspx

For details of the system properties mentioned above, such as System.ItemUrl and System.ItemPathDisplay, and their relationship to Windows Search, see the following:

Property System Overview

https://msdn.microsoft.com/en-us/library/windows/desktop/ff728871(v=vs.85).aspx

Windows Properties

https://msdn.microsoft.com/en-us/library/windows/desktop/dd561977(v=vs.85).aspx

Lesson Learned #36: Best Practices connecting and executing queries using Azure SQL Database


Very often we receive requests about the best practices for establishing connections and executing queries with Azure SQL Database.

In these two videos in Spanish and English we are going to cover these best practices:

  • How connections work inside and outside of the Azure environment.
  • The main configuration points on the client side.
  • How to prevent/manage transient connectivity issues using a retry-logic policy (a minimal sketch follows this list).
  • Best practices for running a query.
  • A sample of how to capture a command timeout using Extended Events.
  • How to monitor a connection made by a client application.
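To illustrate the retry-logic idea covered in the videos, here is a minimal C# sketch (the transient error numbers, delays, and retry count are assumptions for the example, not an official policy):

using System;
using System.Data.SqlClient;
using System.Threading;

static class RetryDemo
{
    // Illustrative subset of error numbers commonly treated as transient for Azure SQL Database.
    static readonly int[] TransientErrors = { 4060, 40197, 40501, 40613, 49918, 49919, 49920 };

    public static void ExecuteWithRetry(string connectionString, string commandText, int maxRetries = 5)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                using (var connection = new SqlConnection(connectionString))
                using (var command = new SqlCommand(commandText, connection))
                {
                    connection.Open();
                    command.ExecuteNonQuery();
                    return;
                }
            }
            catch (SqlException ex) when (attempt < maxRetries && Array.IndexOf(TransientErrors, ex.Number) >= 0)
            {
                // Transient failure: back off briefly and try again.
                Thread.Sleep(TimeSpan.FromSeconds(5 * attempt));
            }
        }
    }
}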

Enjoy!

Create/Update AD users and Group Membership


Many times you need to create users, groups, and a (test) OU to quickly set up an environment. You may or may not have a CSV list for this purpose. However, the need to add certain users to certain groups is based on some logic. For example, HR, Marketing and Sales all represent a sort of 'Department', which also happens to be one of the attributes of an AD user account.

The attached script is an attempt to automate similar tasks. It picks a given attribute from AD user accounts and makes sure the users are members of the groups represented by that attribute. For example, the 'Department' attribute will hold various departments for various users, and those users are likely to belong in their respective department groups. The script can pick up users from a CSV file, check all the users' attributes, and check for (or create) the groups to add memberships to. If users are missing from AD, they will be created.
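The core idea can be sketched roughly like this (a simplified illustration, not the attached script itself; the CSV column names and group settings are assumptions):

Import-Module ActiveDirectory

$attribute = 'Department'
$users     = Import-Csv .\listOfUsers.csv

foreach ($user in $users) {
    # Create the user if it does not exist yet (password handling and other details omitted).
    if (-not (Get-ADUser -Filter "SamAccountName -eq '$($user.SamAccountName)'")) {
        New-ADUser -Name $user.Name -SamAccountName $user.SamAccountName -Department $user.Department
    }

    # Make sure a group named after the attribute value exists, then add the user to it.
    $groupName = $user.$attribute
    if (-not (Get-ADGroup -Filter "Name -eq '$groupName'")) {
        New-ADGroup -Name $groupName -GroupScope Global
    }
    Add-ADGroupMember -Identity $groupName -Members $user.SamAccountName
}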

Below are some sample examples:

EXAMPLE: .\createUsersAndGroups.ps1 -csvFilePath .\listOfUsers.csv -Attribute surname
  1. The example above will create the users listed in the CSV file in the default Users container, since no OU is provided.
  2. It will pick the 'surname' attribute from the user objects and group them by surname, creating the groups if they are missing.
  3. It will add/update the respective users in their surname groups.

EXAMPLE: .\createUsersAndGroups.ps1 -csvFilePath .\listOfUsers.csv -Attribute surname -OUName 'TestingOU'
  1. The example above will check whether the 'TestingOU' OU exists under the domain root, i.e. OU=TestingOU,DC=Contoso,DC=Local, and will create it if missing.
  2. It will create the users listed in the CSV file in the newly created OU.
  3. It will pick the 'surname' attribute from the user objects and group them by surname, creating the groups if they are missing.
  4. It will add/update the respective users in their surname groups.

==> Here is sample output:
PS C:\bridge\createADUsersGroupsOU> .\createUsersAndGroups.ps1 -csvFilePath .\listOfUsers.csv -Attribute Company -Verbose
VERBOSE: 30-Mar-2018 || 16:45:44.086 :: .listOfUsers.csv successfully imported
VERBOSE: 30-Mar-2018 || 16:45:44.086 :: ************ Attempting create users from .listOfUsers.csv ************
VERBOSE: 30-Mar-2018 || 16:45:44.086 :: User johnd already exists - updated attributes as per CSV file.
VERBOSE: 30-Mar-2018 || 16:45:44.086 :: User test1 already exists - updated attributes as per CSV file.
VERBOSE: 30-Mar-2018 || 16:45:44.086 :: ************ Attempting to add users with attribute Company to Company groups
************
VERBOSE: 30-Mar-2018 || 16:45:44.086 :: The group myCompany F5 already exists
VERBOSE: 30-Mar-2018 || 16:45:44.086 :: Users with myCompany F5 Company updated to the group myCompany F5:
kt muskant vimalt manharlalt kirtibent parint suhanat poorvit maharshit manishm falgunim sarthakm dhwanim
VERBOSE: 30-Mar-2018 || 16:45:44.086 :: The group Fantasy Land Inc already exists
VERBOSE: 30-Mar-2018 || 16:45:44.086 :: Users with Fantasy Land Inc Company updated to the group Fantasy Land Inc:
Johnd
VERBOSE: 30-Mar-2018 || 16:45:44.086 :: The group Testing Mania already exists
VERBOSE: 30-Mar-2018 || 16:45:44.086 :: Users with Testing Mania Company updated to the group Testing Mania:
test1

The script can be downloaded from Scripting Gallery: Script

The Azure Government Cloud Solutions Provider (CSP) program keeps growing


Since we launched Azure Government in the Cloud Solution Provider Program (CSP) we have been working with Partners across our ecosystem to bring them the great benefits of this channel, enable them to re-sell Azure Government, and help them grow their business while providing the cloud services their customers need. Below you can find a list of all the authorized Cloud Solution Providers which can sell Azure Government. This list includes all approved Partners as of March 26th, 2018, and we will continue to provide updates as we onboard new partners.

Partner Name CSP Partner Type
1901 Group, LLC Direct Reseller
3Di inc Direct Reseller
AC4S Consulting, Inc. Direct Reseller
Accelera Solutions Inc Direct Reseller
Accenture Federal Services LLC Direct Reseller
Adoxio Business Solutions Limited Direct Reseller
Affigent Direct Reseller
Agile IT Direct Reseller
AIS Network Direct Reseller
Alexan Consulting Enterprise Services, LLC (ACES) Direct Reseller
Alliance Enterprises, Inc. Direct Reseller
Ambonare Direct Reseller
American Technology Services Direct Reseller
APEX TECHNOLOGY MANAGEMENT INC Direct Reseller
Applied Information Sciences, Inc. Direct Reseller
Approved Contact, LLC Direct Reseller
APTT US Gov Cloud Direct Reseller
ArcherPoint, Inc. Direct Reseller
Atmosera, Inc. Direct Reseller
Avtex Solutions Direct Reseller
Bio Automation Support Direct Reseller
Blackwood Associates, Inc. (dba BAI Federal) Direct Reseller
Blue Source Group, Inc. Direct Reseller
Blueforce Development Corporation Direct Reseller
Cambria Solutions, Inc. Direct Reseller
CDO Technologies Inc. Direct Reseller
CENTRALE LLC Direct Reseller
Centurylink Direct Reseller
cFocus Software Incorporated Direct Reseller
CGI Federal, Inc. Direct Reseller
Ciracom Inc. Direct Reseller
CodeLynx, LLC Direct Reseller
Competitive Innovations, LLC Direct Reseller
Computer Professionals International Direct Reseller
ConvergeOne Direct Reseller
Coretek Services Direct Reseller
CorpInfo Services Direct Reseller
Corporate Technologies LLC Direct Reseller
Cre8tive Technology Design Direct Reseller
CSRA LLC Direct Reseller
Datapipe Direct Reseller
Dataprise, Inc. Direct Reseller
DXL Enterprises, Inc. Direct Reseller
Dynamics Intelligence Inc. Direct Reseller
Enterprise Infrastructure Partners, LLC Direct Reseller
Enterprise Services LLC. Direct Reseller
Epoch Concepts Direct Reseller
FCN, Inc. Direct Reseller
General Dynamics Information Technology Direct Reseller
Global Justice Solutions, LLC Direct Reseller
Global Tech Inc. Direct Reseller
GovPlace Direct Reseller
Hanu Software Solutions Inc. Direct Reseller
Harmonia Holdings Group LLC Direct Reseller
Hendrix Corporation Direct Reseller
Hewlett Packard Enterprise Direct Reseller
I10 Inc Direct Reseller
I2, Inc Direct Reseller
i3 Business Solutions, LLC Direct Reseller
ImageSource Direct Reseller
Indicium Technologies Inc Direct Reseller
Info Gain Consulting LLC Direct Reseller
Inforeliance LLC Direct Reseller
InnovaSystems International Direct Reseller
Intelice Solutions, LLC Direct Reseller
ISC Direct Reseller
It1 Source LLC Direct Reseller
IV4, Inc Direct Reseller
JHC Technology, Inc. Direct Reseller
Keylogic Systems, Inc. Direct Reseller
KiZAN Technologies Direct Reseller
Lear360.com Direct Reseller
Leidos Direct Reseller
Liftoff, LLC Direct Reseller
Lucidius Group LLC Direct Reseller
M2 Technology, Inc. Direct Reseller
Magenium Solutions, LLC Direct Reseller
Managed Solution Direct Reseller
MetroStar Systems Inc. Direct Reseller
Mibura Inc. Direct Reseller
MIS Sciences Corp Direct Reseller
Mobomo, LLC Direct Reseller
MS Cloud Express, LLC Direct Reseller
Nanavati Consulting, Inc. Direct Reseller
New Tech Solutions, Inc. Direct Reseller
NewWave Telecom & Technologies, Inc Direct Reseller
Nubelity LLC Direct Reseller
NWN Corporation Direct Reseller
Olive + Goose Direct Reseller
OneNeck IT Solutions Direct Reseller
Onyx Point, Inc. Direct Reseller
Orion Communications, Inc. Direct Reseller
Paragon Software Solutions, Inc. Direct Reseller
Patrocinium Systems, Inc. Direct Reseller
Perrygo Consulting Group, LLC Direct Reseller
Pharicode LLC Direct Reseller
Picis Envision Direct Reseller
Pinao Consulting LLC Direct Reseller
Pitech Solutions Inc Direct Reseller
Planet Technologies Direct Reseller
Plexhosted LLC Direct Reseller
ProArch IT Solutions Direct Reseller
Protected Trust Direct Reseller
Re:discovery Software, Inc. Direct Reseller
Ricoh CSP Fed Direct Reseller
rmsource, Inc. Direct Reseller
Science Applications International Corporation Direct Reseller
Secure-24 Direct Reseller
Shadow-Soft, LLC. Direct Reseller
SHI International Corp Direct Reseller
Simons Advisors, LLC Direct Reseller
Smartronix Direct Reseller
Socius 1 LLC Direct Reseller
Softchoice Corporation Direct Reseller
SoftwareONE Inc. Direct Reseller
Static Networks, LLC Direct Reseller
StoneFly, Inc. Direct Reseller
Strategic Communications Direct Reseller
Stratus Solutions Direct Reseller
Summit 7 Systems, Inc. Direct Reseller
Surveillance and Cyber Security Solutions, LLC Direct Reseller
SWC Technology Partners Direct Reseller
Synoptek LLC Direct Reseller
Tech Data Government Solutions, LLC Direct Reseller
TechTrend Direct Reseller
ThunderCat Technology Direct Reseller
TIC Business Consultants, Ltd. Direct Reseller
Tier1, Inc. Direct Reseller
TKC Global Direct Reseller
U2Cloud LLC Direct Reseller
United Data Technologies, Inc. Direct Reseller
Vana Solutions LLC Direct Reseller
Vespa Group LLC Direct Reseller
Vistronix, LLC Direct Reseller
Wintellect, LLC Direct Reseller
WWT Direct Reseller
X-Centric IT Solutions, LLC Direct Reseller
XentIT, llc Direct Reseller
Xgility Direct Reseller
ZONES INC Direct Reseller
Arrow Enterprise Computing Solutions, Inc. Indirect Distributor
Crayon Indirect Distributor
Insight Public Sector Inc Indirect Distributor
SYNNEX Indirect Distributor
Tech Data Corporation Indirect Distributor

 

If you would like to learn more about the Cloud Solution Provider Program and find answers to commonly asked questions, you can do so here. If you would like to apply to the program, you can visit this link. If you are interested in deploying to our DoD regions via CSP, talk to your CSP provider and they can enable that for you.

For any additional questions, reach out to Azure Government CSP.


Service Fabric 6.1 Refresh Three Release

4/3/2018 Practical DAX for Power BI with Phil Seamark


In this session Phil is going to show how easy it is to solve everyday reporting needs with DAX - without needing a bunch of training to do it!

 

Practical DAX for Power BI with Phil Seamark

A walk-through of using DAX to create, shape and model data, highlighting helpful tips and tricks. DAX stands for Data Analysis Expressions; it is the formula language used throughout Power BI (and is also used by Power BI behind the scenes). DAX is also found in other offerings from Microsoft, such as Power Pivot and SSAS Tabular, but this session focuses on how DAX is used - and can be used by you - in Power BI.

When: 4/3/2018 10AM PST

Where: https://www.youtube.com/watch?v=1fGfqzS37qs 

Philip Seamark

Phil is an author, Microsoft Data Platform MVP and an experienced database and business intelligence (BI) professional with deep knowledge of the Microsoft BI stack, along with extensive knowledge of data warehouse (DW) methodologies and enterprise data modelling. He has 25+ years of experience in this field and is an active member of the Power BI community - a Super User with 1,945 answers and 590 kudos on the community website http://community.powerbi.com

http://radacad.com/philip-seamark 

Encrypting Data Disks on Linux with Azure Disk Encryption


There are two main ways to encrypt data disks using Azure Disk Encryption.  You can choose to thoroughly encrypt the existing disk block-by-block using EnableEncryption, or you can choose to rapidly format and encrypt the data disk using EncryptFormatAll.

Both of these techniques have a common setup requirement that will signal to the ADE solution which data disks are to be encrypted:

Prerequisite Data Disk Setup

First, prior to enabling encryption, the data disks to be encrypted need to be properly listed in /etc/fstab. Take care to use a persistent block device name for this entry, as device names in the "/dev/sdX" format cannot be relied upon to stay associated with the same disk across reboots, particularly after encryption is applied. For more detail on this behavior, see:  https://docs.microsoft.com/en-us/azure/virtual-machines/linux/troubleshoot-device-names-problems
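For example, an /etc/fstab entry keyed on the filesystem UUID (the UUID and mount point below are placeholders) remains stable across reboots:

UUID=11111111-2222-3333-4444-555555555555   /datadrive   ext4   defaults,nofail   0   2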

 

Next, ensure that the /etc/fstab settings are configured properly for mounting. To do this, run the mount -a command, or reboot the VM and trigger the remount that way. Once that is complete, check the output of the lsblk command to verify that the desired drive is still mounted. The reason for this test is simple: if the /etc/fstab file does not mount the drive properly prior to enabling encryption, ADE will not be able to mount it properly either. The ADE process moves the mount information out of /etc/fstab and into its own configuration file as part of the encryption process, so do not be surprised to see the entry missing from /etc/fstab after data drive encryption completes. Also be sure to allow time after reboot for the ADE process to mount the newly encrypted disks. They will not be immediately available after a reboot; the ADE process needs time to start, unlock, and then mount the encrypted drives before they become available for other processes to access. This may take more than a minute after reboot depending on the system characteristics.

 

​A simple example of commands that can be used to mount the data disks and create the necessary /etc/fstab entries can be found here:

https://github.com/ejarvi/ade-cli-getting-started/blob/master/validate.sh#L197-L205

 

Once the data disk is ready, you can choose to either encrypt the existing data block-by-block using EnableEncryption, or you can choose to rapidly format and encrypt the data disks using EncryptFormatAll.

EnableEncryption (encrypt data drive contents, preserving content)

This process can be very time consuming depending on the size of the drive contents to encrypt, but it allows existing content to be encrypted.

Powershell:

To use this process, prepare the drive as above and then use the PowerShell cmdlet Set-AzureRmVMDiskEncryptionExtension:
https://docs.microsoft.com/en-us/powershell/module/azurerm.compute/set-azurermvmdiskencryptionextension?view=azurermps-5.4.0
Encryption may take several hours if the "OS" or "All" VolumeType is selected.  If the "Data" VolumeType is selected then the time required will be proportional to the size of the data volume(s) to be encrypted.
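A minimal PowerShell sketch of that call (resource names and Key Vault values below are placeholders; depending on the version of the solution, additional parameters such as Azure AD application credentials may also be required):

$rgName   = "MyResourceGroup"
$vmName   = "MyLinuxVm"
$keyVault = Get-AzureRmKeyVault -VaultName "MyKeyVault" -ResourceGroupName $rgName

Set-AzureRmVMDiskEncryptionExtension `
    -ResourceGroupName $rgName `
    -VMName $vmName `
    -DiskEncryptionKeyVaultUrl $keyVault.VaultUri `
    -DiskEncryptionKeyVaultId $keyVault.ResourceId `
    -VolumeType Data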

CLI:

Prepare the drive as above, and then use the az vm encryption enable command.
An example bash script of how to enable encryption end to end that uses this command is available here: https://github.com/ejarvi/ade-cli-getting-started/blob/master/validate.sh
Please note that on successful completion of this script, the script will automatically delete the resources that it just created.  If you would like to preserve the resource, make sure to comment out or remove the last line of the script prior to running.

ARM Template:

Using an ARM template to encrypt a running Linux VM is possible. Ensure the EncryptionOperation value is set to "EnableEncryption" - I don't have a pointer to a template that uses this yet, but for now modifying this template may be sufficient:

 

EncryptFormatAll

This process is very fast but is only for scenarios where there is no existing content on the mounted data drives.
After running this command any drives that were mounted previously will be formatted and then the encryption layer will be started on top of that now empty drive.
Additionally, when this option is selected, the "ephemeral" resource disk attached to the VM will also be encrypted.  If the ephemeral drive is reset, it will be reformatted and re-encrypted by the ADE solution in the VM at the next opportunity.
Some additional documentation of the EncryptFormatAll mode, including PowerShell and ARM template examples, is available at:

4/5 webinar Developing with Power BI Embedding – The April 2018 Update by Ted Pattison


Good friend and kindred spirit, Ted Pattison, joins me for a look at the updates for developers in the world of Power BI.

Developing with Power BI Embedding – The April 2018 Update

In this developer-oriented webinar, Ted Pattison will discuss recent enhancements to the Power BI embedding platform and how they can be leveraged by ISVs and enterprise developers. Ted will explain the differences between the two primary development models (user-owns-data vs. app-owns-data), and he will also discuss when to use Power BI Premium versus when to use the Power BI Embedded service in Microsoft Azure. Attendees will learn essential programming skills for embedding reports and dashboards using the Power BI Service API together with the Power BI JavaScript API. Along the way, attendees will learn when and how to work with embed codes and how to leverage new Power BI embedding features such as the ability to embed an individual report visual in a custom application.

When: 4/5/2018 10AM PST

Where: https://www.youtube.com/watch?v=swnGlrRy588

Ted Pattison

 

Ted Pattison is an author, instructor, co-founder and owner of Critical Path Training, a company dedicated to education on Power BI, Office 365 and SharePoint technologies. He is a 12-time recipient of Microsoft's MVP award, and for the last three SharePoint releases Ted has worked with Microsoft's Developer Platform Evangelism group researching and authoring training material for early adopters. Ted has already taught hundreds of professionals how to get started building custom business solutions using Microsoft technologies.

Hometown: Tampa, FL USA
Website: Visit Website 
Twitter: @TedPattison 

 

IntelliTrace and the ‘Magic’ of Historical Debugging


Premier Developer Consultant Lizet Pena De Sola discusses the Visual Studio Debugging and Diagnostic Tools Workshop, in which she discovered IntelliTrace and its magical debugging capabilities.


Lizet recently visited a development team in Nevada that was eager to learn more about Visual Studio debugging tools and the C# compiler Open Source project in GitHub, named ‘Roslyn’. The highlight of the sessions was the ‘discovery’ of IntelliTrace and how they could use this feature in improving the communication between the development team in Nevada and the QA team at another location.

A few hard-to-reproduce bugs had been filed recently, one of them being intermittent without a consistent set of steps to reproduce. The team was using process dumps and WinDbg to try to pinpoint the cause, but, even though process dumps have their reason for being, the size of the dump files made the search for a root cause quite difficult.

That is, until they tried IntelliTrace in their QA environments.

To learn how IntelliTrace improved this team’s debugging and QA, read the rest of the post on Lizet’s blog.
