Channel: MSDN Blogs

Tutorial: Run Spring Boot web service in Windows Docker container


Premier Developer Consultant Pete Tian demonstrates how to build a Spring Boot web project and run it in Docker.


This tutorial walks you through building a Spring Boot web project, creating a Dockerfile, building a Docker image, and then running it in a Windows Docker container.

Create a Spring Boot web service

First, let's generate a Spring Boot web project. Open a browser, go to http://start.spring.io, type Web in the Search for dependencies bar, then click the Generate Project button.

The generated source code is saved to your download folder as demo.zip. Extract it to your IDE's workspace and import the project using the Existing Maven Projects wizard.
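As a preview of the Dockerfile step, here is a minimal sketch for packaging the Boot jar in a Windows container. The base image tag and jar name are assumptions (the Maven build of the generated demo project typically produces demo-0.0.1-SNAPSHOT.jar); the full walkthrough is in Tian's post.

```dockerfile
# Minimal Windows-container Dockerfile for a Spring Boot jar (illustrative sketch)
FROM openjdk:8-jdk-windowsservercore
COPY target/demo-0.0.1-SNAPSHOT.jar app.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "app.jar"]
```

You would then build and run it with `docker build -t demo .` followed by `docker run -p 8080:8080 demo`.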

Continue reading Tian’s post via LinkedIn.


Hitting Refresh on Getting Results the Agile Way


Agile Results is a personal productivity system for learners, leaders, achievers, and even productive artists.

With Agile Results you’ll improve your focus, master your motivation, develop a growth mindset, and use your strengths to compound your results.

It’s effectively a whole-person approach to productivity that helps you realize your full potential while making progress on meaningful results.

I did a major overhaul of Getting Results the Agile Way:

GettingResults.com

I made a lot of changes so I’ll try to summarize them here.

You’ll notice a very simple tag line:

“Better energy, better results.”

You’ll find it easier to get started with Agile Results:

Get Started with Agile Results

You’ll enjoy a much simpler menu experience that organizes the entire site:

Articles | Books | Courses | Topics | Resources

You should find it way easier now to browse resources by key topics.  I created Hub Pages for the following topics:

You’ll find it easy to take a quick tour of Agile Results.  Here is my quick tour guide:

Quick Tour of Agile Results

You will find resources galore:

There is even a manifesto for Agile Results:

Agile Results Manifesto

You will also find free eBooks:

30 Days of Getting Results Free eBook (PDF)

Getting Started with Getting Results the Agile Way Free eBook (PDF)

And there is even free training:

7 Day Agile Results Jumpstart

30 Days of Getting Results

Renew Dynamics 365 for Finance and Operations Certificate on Dev Machine


This was an internal request from the support team to quickly fix a certificate expiry issue. I would like to post it here in case you need it. Please note this applies only to your dev VHD, and I strongly recommend you create a checkpoint before proceeding.

One script covers all the steps (renew the certificates, grant permissions, replace the thumbprints in the config files, and reset IIS and the batch service):

Function Update-Thumbprint
{
    Set-Location -Path "Cert:\LocalMachine\My"
    $oldCerts = Get-ChildItem | Where-Object { $_.Subject -match "DeploymentsOnebox" -or $_.Subject -match "MicrosoftDynamicsAXDSCEncryptionCert" }
    $ConfigFiles =
    @("C:\AOSService\webroot\web.config",
      "C:\AOSService\webroot\wif.config",
      "C:\AOSService\webroot\wif.services.config",
      "C:\FinancialReporting\Server\ApplicationService\web.config",
      "C:\RetailServer\webroot\web.config"
      )
    foreach ($oldCert in $oldCerts)
    {
        $newCert = New-SelfSignedCertificate -CloneCert $oldCert
        # consider deleting the old cert afterwards
        $keyPath = Join-Path -Path $env:ProgramData -ChildPath "Microsoft\Crypto\RSA\MachineKeys"
        $keyName = $newCert.PrivateKey.CspKeyContainerInfo.UniqueKeyContainerName
        $keyFullPath = Join-Path -Path $keyPath -ChildPath $keyName
        $aclByKey = (Get-Item $keyFullPath).GetAccessControl('Access')
        $permission = "Everyone", "Read", "Allow"
        $accessRule = New-Object -TypeName System.Security.AccessControl.FileSystemAccessRule -ArgumentList $permission
        $aclByKey.SetAccessRule($accessRule)
        Set-Acl -Path $keyFullPath -AclObject $aclByKey -ErrorAction Stop
        foreach ($configFile in $ConfigFiles)
        {
            (Get-Content -Path $configFile).Replace($oldCert.Thumbprint, $newCert.Thumbprint) | Set-Content $configFile
        }
    }
}

Update-Thumbprint

iisreset

Restart-Service "DynamicsAxBatch"

Copy the whole script and run it in PowerShell with administrator privileges.

Each time you run this script it creates a new set of certificates, so do not run it repeatedly.
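To double-check the result, you can list the affected certificates with their new expiry dates. This is a small verification sketch using standard cmdlets; the subject filter mirrors the one in the script above.

```powershell
# List the machine certificates touched by the script, with their expiry dates
Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -match "DeploymentsOnebox" -or $_.Subject -match "MicrosoftDynamicsAXDSCEncryptionCert" } |
    Select-Object Subject, Thumbprint, NotAfter
```

The NotAfter column should now show dates well in the future for the cloned certificates.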

Hope it helps.

Friday Five: Microsoft Kaizala, Ignite 2018, and More!


Project Server 2016: Missing Alerts And Reminders Settings

Mohamed El-Qassas is a Microsoft MVP, SharePoint StackExchange Moderator, C# Corner MVP, TechNet Wiki Ninja, blogger, and Senior Technical Consultant with 10+ years of experience in SharePoint and Project Server. Check out his blog here.

 

Microsoft Kaizala

John Naguib is an Office Servers and Services MVP. He's also a Solution Architect and Senior Consultant with deep knowledge of SharePoint. In addition, he has a strong .NET application development background and is knowledgeable in Office 365, Azure, and several Microsoft products. John is a recognized expert within the IT industry, having published several gold-award articles on Microsoft TechNet blogs and spoken at several events. He is based in Egypt. Follow him on Twitter @johnnaguib.

ASP.NET Core: Entity Framework Call Store Procedure

Asma Khalid is a technical evangelist, technical writer, and fanatic explorer. She enjoys doodling with technologies, writing stories, and sharing her knowledge with the community. Her core domain is software engineering, but she's also experienced in product management, monitoring, implementation, execution, and coordination. She is the first woman from Pakistan to receive the Microsoft Most Valuable Professional (MVP) award, and the first woman from Pakistan to receive the C# Corner online developer community MVP award. She has 6+ years of experience as an IT professional, freelancer, and entrepreneur. She is currently working on her entrepreneurial venture, AsmaKart. Follow her on Twitter @asmak.

Make Windows Admin Center High available running on a Windows Server 2019 Cluster

Robert Smit is an EMEA Cloud Solution Architect at Insight.com and has been a Microsoft Cloud and Datacenter Management MVP since 2009. Robert has over 20 years of IT experience in the educational, healthcare, and finance industries. His time in the trenches of IT gives him the knowledge and insight to communicate effectively with IT professionals. Follow him on Twitter @Clustermvp.

Ignite 2018 update - consistent labels

Albert Hoitingh lives in The Hague (the Netherlands) and works as an Office 365 solutions architect for Motion10. Albert’s core interest is everything to do with information management, (Azure) information protection, governance, and security. Albert loves to inspire people by blogging and speaking in the Netherlands and beyond. Albert won the MVP award in Enterprise Mobility in April 2018. Follow Albert on Twitter @alberthoitingh or his blog.

 

DigitaleSchule.Bayern – the teacher congress at the Realschule Gauting puts digital media for everyone on the curriculum


The year 2019 starts off digitally: on January 29 and 30, the teacher congress "DigitaleSchule.Bayern" takes place at the Realschule Gauting near Munich. On both days everything there revolves around digital media and how they can effectively support teaching in all types of schools, with different solutions for all school types presented in a practical, hands-on way.

On both days of the congress, Microsoft is offering a range of workshops on current topics and technologies in a dedicated room. One of the focus topics is inclusion and how to support it with digital media and technologies.

Assistive tech in literacy acquisition, or how to support inclusion digitally

Large classes in which students with different special-needs profiles learn together are a challenge for the teacher. A dedicated workshop at the DigitaleSchule.Bayern teacher congress therefore takes a detailed look at the possibilities of using Microsoft Learning Tools for internal differentiation in literacy acquisition.

"Inklusion setzt voraus, dass Schülerinnen und Schüler an ihren individuellen Förderschwerpunkten arbeiten können, und das möglichst in ihrem eigenen Rhythmus. Wenn diese individuelle Förderung ein struktureller Bestandteil des Unterrichtsgeschehens in der Klasse ist und Ergebnisse zusammengeführt werden, findet echte Binnendifferenzierung statt: lernerzentriert und kooperativ zugleich."
Arwen Schnack, Dozentin und Autorin für Deutsch als Fremdsprache

In particular, reading and spelling difficulties, the common struggles of slower learners, and support for German as a second language can be addressed in an uncomplicated, user-friendly way with special software that will be presented in the workshop.

The workshop is aimed at teachers of all subjects in which working with text plays a role. All participants are also invited to experiment with their own ideas on site. The results will be discussed together, along with the possibilities and limits of the software presented.

The workshop takes place on day 1 of the DigitaleSchule.Bayern teacher congress, January 29, 2019, in workshop slot 1. You can find more information and registration here.

With the turtle to more participation: programming in inclusive learning settings

Programming is THE topic of the future and is becoming ever more important in educational contexts. Calls are growing louder from various sides to establish programming as another foreign language in schools. But which programming tools are really suitable for educational settings, and how can you get girls and boys equally excited about programming? Does it also work in inclusive learning settings? And what do you do as a teacher if you cannot program yourself?

Support offerings for teachers

The education initiative Code your Life takes on this challenge and offers teachers a broad range of support for creating exciting and creative programming experiences with children from the third grade onwards. In the workshop, participants get an insight into how programming can be taught in an accessible and vivid way, always with a certain dose of challenge and a guaranteed wow effect.

At the center are the Logo programming language and the TurtleCoder app, which can be used to learn basic programming principles step by step and to identify algorithmic structures. With the help of a digital turtle, a few first commands, and a little willingness to experiment, impressive self-programmed artworks emerge in no time.

The workshop takes place on day 1 of the DigitaleSchule.Bayern teacher congress, January 29, 2019, in workshop slot 2. You can find more information and registration here.

Lesson Learned #50: Adding PK led to transaction log full error


Hello,

It is common to see cases where a customer has millions of rows in a table, needs to add a column, and hits a transaction-log-full error.

Depending on the Azure SQL Database tier, there are two important factors to be aware of for completing this process: transaction log size and transaction log IOPS.

 

It is very important to note the following points about transaction log behavior in the Azure SQL Database engine:

  • All databases use the full recovery model and it is not possible to change it. This means that all transactions are recorded in the transaction log.
  • Every 5-10 minutes the Azure SQL Database engine performs a transaction log backup to prevent problems with the transaction log.
  • The Azure SQL Database engine shrinks the transaction log automatically.
  • The data space used by the database does not include the transaction log space used.
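While such an operation runs, you can watch how much of the transaction log is in use by querying the log-space DMV, which is available in Azure SQL Database (this query is my addition, not from the original post):

```sql
-- Check transaction log size and percentage used for the current database
SELECT total_log_size_in_bytes / 1048576.0 AS total_log_size_mb,
       used_log_space_in_bytes / 1048576.0 AS used_log_space_mb,
       used_log_space_in_percent
FROM sys.dm_db_log_space_usage;
```

If used_log_space_in_percent climbs steadily during a single large transaction, you are heading for the transaction-log-full error described below.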

 

In this situation, our customer got the transaction-log-full error while adding a new PK column to their table. Before solving the issue, let me explain why it happens:

  • Every time you add such a column, SQL Server creates a new table, imports the data from the old table into the new one, and then rebuilds the indexes, all in a single transaction. For this reason you will see blocking issues, high IO and DATAIO consumption, and the transaction log will need two to three times the space (remember, the full recovery model is in use).

 

Let me show you an example and how to fix it:

    • I created a table called DatosSource that has two fields, ID (int) and Data (nchar(10)).
    • I changed the ID field from int to bigint and captured what the SQL Server engine does. Everything happens in a single transaction and all the data goes through the transaction log (full recovery model, which cannot be changed). Although I could have used Azure SQL Auditing, I used on-premises SQL Server with SQL Server Profiler to show it; the behavior is the same as in Azure SQL Database:
      • Create a new table called TMP_DatosSource.
      • Lock the whole table against new updates.
      • Insert the data into TMP_DatosSource from DatosSource.
      • Drop the DatosSource table.
      • Rename TMP_DatosSource back to DatosSource.
      • Rebuild the indexes of the new DatosSource table if needed.

    • Even after scaling the database up to Premium P15, our customer still hit the issue (they did not scale up to the Hyperscale tier). So my suggestion is to reduce transaction log usage by migrating the data in batches, reducing the size of each transaction and using multiple transactions.
    • I created a new temporary table called TMP_DatosSource with ID (bigint) and Data (nchar(10)).
    • I inserted this number of rows:

    • DECLARE @Times as int = 0
      WHILE @Times <= 100000000
      BEGIN
          SET @Times = @Times + 1
          INSERT INTO [dbo].[DatosSource] (Id, Data) VALUES (@Times, 1)
      END
    • This is the best way to transfer the data without causing high consumption in your transaction log and tempdb. My suggestion is to use a cursor (DECLARE CURSOR); you can modify the SELECT command to filter specific data and reduce the number of rows to transfer. Every 50,000 rows I executed a COMMIT TRANSACTION and initiated a new transaction. While this query runs you can watch the space usage, and once the process has finished you can rename the table, create the indexes, and so on.

        • DECLARE @id int, @name nchar(10)
          DECLARE @Total as bigint = 0
          DECLARE @FirstTime as bit = 0
          TRUNCATE TABLE [TMP_DatosSource]
          DECLARE vcursor CURSOR FOR SELECT ID, Data FROM DatosSource
          OPEN vcursor
          FETCH NEXT FROM vcursor INTO @id, @name
          WHILE @@FETCH_STATUS = 0
          BEGIN
              IF @FirstTime = 0
              BEGIN
                  BEGIN TRANSACTION
                  SET @FirstTime = 1
              END
              SET @Total = @Total + 1
              INSERT INTO [dbo].[TMP_DatosSource] VALUES (@id, @name)
              FETCH NEXT FROM vcursor INTO @id, @name
              IF @Total > 50000
              BEGIN
                  SET @Total = 0
                  COMMIT TRANSACTION
                  BEGIN TRANSACTION
              END
          END
          IF @@TRANCOUNT > 0
              COMMIT TRANSACTION
          CLOSE vcursor;
          DEALLOCATE vcursor;

    Enjoy!

    The case of the orphaned critical section despite being managed by an RAII type



    Some time ago,
    I was enlisted to help debug an elusive deadlock.
    Studying a sampling of process memory dumps led
    to the conclusion that a critical section had been orphaned.
    Sometimes, the thread that owned the critical section had
    already exited,
    but sometimes the thread was still running,
    but it was running code completely unrelated to the critical section.
    It was as if the code that acquired the critical section had
    simply forgotten to release it before returning.



    The thing is,
    all attempts to acquire the critical section were managed
    by an RAII type,
    so there should be no way that the critical section could have
    been forgotten.
    And yet it was.



    When would the destructor for an RAII object be bypassed?
    One possibility is that somebody did an
    Exit­Thread
    or (horrors)
    Terminate­Thread.
    But this doesn't match the evidence,
    because as noted above,
    in some of the crash dumps,
    the critical section owner is still alive and running,
    but unaware that it owns the critical section.



    On all platforms other than x86,
    exception unwind information is kept in
    tables that are kept in a rarely-used portion of the image,
    so that we don't waste memory on exception unwind information
    until an exception actually occurs:
    When an exception occurs, the system pages in the unwind tables
    and does a table lookup to see which unwind handler should run.
    But on x86, the exception unwind state is maintained manually in the code.
    This is a bad thing for x86 performance,
    but a good thing for getting inside the head of the compiler.



    Bonus reading:
    Unwinding the Stack: Exploring How C++ Exceptions Work on Windows,
    James McNellis, CppCon 2018.


    The unwind checkpoint is a 32-bit value,
    usually stored at [ebp-4].
    The compiler uses it to keep track of what needs to get unwound
    if an exception occurs.
    If the compiler can deduce that
    no exception can occur between two checkpoints,
    then it can optimize out the first checkpoint.



    There are four functions that enter the critical section in question.
    The code that does so looks like this:


    {
    auto guard = SystemChangeListenerCS.Lock();
    ... some code ...
    } // guard destructor releases the lock


    Finding the exact point where the guards are created is made easier with
    the assistance of the # debugger command,
    which means "Disassemble until you see this string in the disassembly."




    0:000> #SystemChangeListenerCS SystemChangeListenerThreadProc
    SystemChangeListenerThreadProc+0x7c:
    1003319c mov ecx,offset SystemChangeListenerCS (100b861c)
    0:000>


    Okay, so the debugger found a line of assembly that
    mentions System­Change­Listener­CS.
    Let's look to see whether there is an unwind checkpoint after
    the lock is taken.



    0:000> u 1003319c
    contoso!SystemChangeListenerThreadProc+0x7c:
    1003319c mov ecx,offset contoso!SystemChangeListenerCS (100b861c)
    100331a1 push eax
    100331a2 call Microsoft::WRL::Wrappers::CriticalSection::Lock (1002a863)
    100331a7 mov byte ptr [ebp-4],5


    We see that immediately after acquiring the lock,
    the code updates [ebp-4] to remember
    that it needs to destruct the lock guard
    in case an exception occurs.



    Exercise:
    I said that the unwind state is recorded in a 32-bit value
    stored at [ebp-4],
    but the code here updates only a byte.
    Why only a byte?



    The lock is acquired again later in that same function,
    so we'll search some more.
    If you leave off the second parameter to the # command,
    it continues searching where the previous search left off.



    0:000> #SystemChangeListenerCS
    SystemChangeListenerThreadProc+0x487:
    100335a7 mov ecx,offset contoso!SystemChangeListenerCS (100b861c)
    0:000> u 100335a7
    contoso!SystemChangeListenerThreadProc+0x487:
    100335a7 mov ecx,offset contoso!SystemChangeListenerCS (100b861c)
    100335ac push eax
    100335ad call Microsoft::WRL::Wrappers::CriticalSection::Lock (1002a863)
    100335b2 mov byte ptr [ebp-4],0Dh


    Okay, so this lock guard is also marked for unwinding.



    The next function that uses the critical section
    is Reset­Widgets.



    0:000> #SystemChangeListenerCS ResetWidgets
    ResetWidgets+0x133:
    10033fcc mov ecx,offset SystemChangeListenerCS (100b861c)
    0:000> u 10033fcc l4
    ResetWidgets+0x133:
    10033fcc mov ecx,offset SystemChangeListenerCS (100b861c)
    10033fd1 push eax
    10033fd2 call Microsoft::WRL::Wrappers::CriticalSection::Lock (1002a863)
    10033fd7 call Microsoft::WRL::ComPtr<IStream>::Reset (10039932)
    10033fdc call Microsoft::WRL::ComPtr<Widget>::Reset (10039142)
    10033fe1 cmp dword ptr [ebp-4Ch],0
    10033fe5 je ResetWidgets+0x157 (10033ff0)
    10033fe7 push dword ptr [ebp-4Ch]


    Hm, this function doesn't create an unwind checkpoint after
    taking the lock.
    This means that the compiler believes that no exception can occur
    between the point the guard is created and the next thing that
    would require updating the unwind checkpoint
    (in our case, that would be the point the lock is destructed).



    We repeat this analysis with the other two functions.
    One of them creates an unwind checkpoint; the other doesn't.



    Why does the compiler believe that no exceptions can occur
    in the guarded block?
    Well, inside the block it calls

    ComPtr::Reset

    twice,
    and it does some other stuff.
    The Reset method is declared like this:



    template<typename T>
    class ComPtr {
    unsigned long Reset() { return InternalRelease(); }
    unsigned long InternalRelease() throw() { ... }
    ...
    };


    Observe that

    the Internal­Release method

    uses the deprecated throw() specifier,
    which says that the method never throws an exception.
    The compiler then inferred that the Reset method
    also never throws an exception, since it does nothing that could
    result in an exception.



    This code was compiled before the Microsoft C++ compiler
    added the /std:c++17 switch,
    so it uses

    the old rules for the throw() specifier
    ,
    which for the Microsoft C++ compiler boils down to
    "I'm trusting you never to throw an exception."



    My theory is that the Reset actually did throw
    an exception.
    Since the compiler didn't create an unwind checkpoint,
    the lock guard did not get unwound.
    The exception was caught higher up the stack,
    so the process didn't crash.



    Digging into the two objects wrapped inside the ComPtr
    revealed that the first one was a Widget­Monitor
    object.



    Exercise:
    The first was really an IWidget­Monitor interface,
    so why did it disassemble as
    ComPtr<IStream>?



    The Widget­Monitor's destructor went like this:



    WidgetMonitor::~WidgetMonitor()
    {
    Uninitialize();
    }

    void WidgetMonitor::Uninitialize()
    {
    blah blah;
    ThrowIfFailed(m_monitor.Deactivate());
    blah blah;
    ThrowIfFailed(m_monitor.Disconnect());
    blah blah;
    }



    Now you see the problem.
    If the Uninitialize method throws an exception,
    the exception will propagate out of the destructor.
    (This code is so old that it predates C++11's rule
    that destructors are noexcept
    by default where possible.)
    And then it will propagate out of
    ComPtr::Internal­Release,
    and then out of ComPtr::Reset,
    and then out of Reset­Widgets.
    And unwinding out of Reset­Widgets
    will not run the lock guard's destructor because the compiler
    assumed that no exception could be thrown,
    thanks to the throw() specifier on the
    ComPtr::Internal­Release method.



    As is often the case, it's usually a lot easier to find something
    once you know what you're looking for.
    The team dug into its telemetry to see that, yes indeed,
    the systems that encountered the problem had also thrown an
    exception from
    Widget­Monitor::Uninitialize,
    thus confirming the theory.



    Now they could work on fixing the problem:
    Fix the destructor so it doesn't throw any exceptions.
    In this specific case,
    the exception was thrown because they were deactivating
    an object that hadn't been fully activated.
    Since

    cleanup functions cannot fail
    ,
    the best you can do is to

    just soldier on and clean up as much as you can
    .

    Lesson Learned #51: Managed Instance – Import via Sqlpackage.exe doesn’t allow autogrow


    Some days ago I worked on a service request where our customer faced the following issue:

    • They downloaded the newest version of the Sqlpackage.exe tool (18.0) and tried to import a bacpac into their Azure SQL Managed Instance. The import started successfully but hung when the primary filegroup of the imported database reached 32,768 MB, and the import process raised the following error:

    'Could not allocate a new page for database 'customer_database' because of insufficient disk space in filegroup 'PRIMARY'. Create the necessary space by dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.'

    Let me explain: when you create a database in Azure SQL Managed Instance, autogrowth is enabled but limited in space, so you need to change it as follows:

    • Using SQL Server Management Studio, under the database properties -> Files, I was able to change the maximum file size.
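    If you prefer T-SQL over the SSMS dialog, the same change can be made with ALTER DATABASE. The logical file name below ('data_0') and the size are placeholders of my own; look up your actual logical names in sys.database_files first.

```sql
-- Find the logical names and current limits of the database files
SELECT name, type_desc, size, max_size, growth FROM sys.database_files;

-- Raise the maximum size of the data file (logical name is a placeholder)
ALTER DATABASE [customer_database]
    MODIFY FILE (NAME = 'data_0', MAXSIZE = 102400MB);
```

    After raising MAXSIZE, the import can grow the primary filegroup past the 32,768 MB limit it was stuck at.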


    Enforcing security controls right from CI/CD pipeline with AzSK – Deep Dive


    Azure Security Kit, aka AzSK, is a framework used internally by Microsoft to control and govern their Azure subscriptions. While some features overlap with Azure Security Center, I find a lot of value in the kit, mostly in the following areas:

    • The attestation module, allowing for full traceability of security control deviations and justification of why a given control was not respected, which may be very useful in case of an internal or external audit.
    • The CI/CD extensions available on the marketplace. These make it possible to enforce security controls from CI builds onwards, so very early in the application lifecycle. On top of the Azure DevOps extensions, the kit also ships with Visual Studio extensions to provide security guidance as the developer is typing the code.
    • A lot of room for customizing control settings, creating organization-specific controls, etc.
    • It is free; the only costs would be incurred by the few resources (a storage account and an Automation account) required to use the kit, and overall they are very low.

    The kit comes with other things such as Continuous Assurance (CA) and OMS integration, but that's where it overlaps a bit with Azure Security Center (ASC), the latter being much easier to use but less customizable, and not covering CI/CD. AzSK's documentation is rather good, but it still takes a bit of time to get acquainted with, especially in the CI/CD area, which is still a work in progress. I dug into the source code of AzSK, and more particularly into the ARM library checker (a .NET/.NET Core module), to understand exactly how it works, and that's what I'm going to share here.

    That said, you should already know how to create a custom org policy, set up CA, etc., as I'm not going to go through all the details. The official doc is, all in all, certainly about 50 pages. If you don't know AzSK at all, you may still read this article, but it will be harder to grasp.

    Step 1 - working with the Azure DevOps tasks
    Currently, the extension comes with two tasks: AzSK_ARMTemplateChecker, used to inspect and validate ARM template files, and AzSK_SVTs, used to validate the overall security of the resource group bound to the current release (or of resources identified by a tag name/value pair used and deployed by your application). AzSK_ARMTemplateChecker is meant to be used as a pre-deployment scan, while AzSK_SVTs is meant as a post-deployment scan. Even when you use AzSK_ARMTemplateChecker, it is a good idea to also validate the overall solution with AzSK_SVTs, since tasks in your release pipeline could modify resources not via ARM templates but via PowerShell, the Azure CLI, etc., making a final scan with AzSK_SVTs necessary.

    Those two tasks use cmdlets behind the scenes, which can also be used directly by sysadmins. The configuration of the tasks is explained in the AzSK doc, but I'd like to add some details that are not discussed there. For the time being, the ARM checker task does not work with parameters, which means it expects the actual parameter values to be part of the template. For instance, one of the built-in controls checks how many instances of a given App Service plan are running. That control is the following:

    
    {
      "id": "AppService270",
      "apiVersions": [ "2016-09-01" ],
      "controlId": "Azure_AppService_BCDR_Use_Multiple_Instances",
      "isEnabled": true,
      "description": "App Service must be deployed on a minimum of 1 instance to ensure availability",
      "rationale": "App Service deployed on multiple instances ensures that the App Service remains available even if an instance is down.",
      "recommendation": "Run command 'Set-AzureRmAppServicePlan -Name '<AppServicePlanName>' -ResourceGroupName '<RGName>' -NumberofWorkers '<NumberofInstances>''. Run 'Get-Help Set-AzureRmAppServicePlan -full' for more help.",
      "severity": "High",
      "jsonPath": [ "$.sku.capacity" ],
      "matchType": "IntegerValue",
      "data": {
        "type": "GreaterThan",
        "value": 1
      }
    }
    
    

    As you can see, the jsonPath attribute contains the relative path to the property of the corresponding resource in the checked ARM template, which typically looks like this:

    
    "resources": [
    {
    ...
    "sku": {
    "name": "[parameters('skuName')]",
    "capacity": "[parameters('skuCapacity')]"
    },
    ...
    }
    
    

    where the value of "capacity" is supposed to come from the "skuCapacity" parameter. However, if you ask the ARM checker task (or the cmdlet) to check the template "as is", it will fail, because the condition in the control states that the value should be greater than 1, meaning the library will compare "[parameters('skuCapacity')]" > 1, which will of course fail. Since hardcoding values directly in the template isn't CI/CD friendly, the easiest way to work around this is to add a configuration transformation step as follows:

    In the transformation step you specify which transformation to apply. In this case, I'm asking the task to take my first resource and replace the sku capacity value with the value of my release variable, which does the trick. For simplicity I'm illustrating this with only this parameter, but you should of course do it for all the parameters.
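    After the transform runs, the fragment the checker sees contains concrete values instead of parameter references, e.g. (illustrative values of my own):

```json
"sku": {
  "name": "S1",
  "capacity": 2
}
```

    Now the comparison 2 > 1 can actually be evaluated, and the control passes.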

    Next you can simply use the AzSK_ARMTemplateChecker task, which is easy to configure. However, I had a hard time finding out how to specify which controls should be taken into account by the task. Although the official documentation is quite exhaustive, there are still some grey zones, and there is currently a bug (specific to the ARM checker, which Microsoft is working on) in this feature, which is why it didn't work for me at first.

    Step 2 - controls being checked by the ARM Checker
    By default, if you haven't set up a custom organization policy, the controls checked by AzSK are retrieved from the default online policy store: https://azsdkossep.azureedge.net/1.0.0/ARMControls.json.

    The file contains a series of resources, which in turn contain a series of controls. The control I showed earlier (AppService270) is part of this file, under the AppService resource. By default, all controls applicable to the resources in your ARM template will be executed. If you want to use your own controls, you first need to use the latest version of AzSK (currently 3.9.0), then take a copy of ARMControls.json, modify it as you wish, and upload it to your own online policy store. That's what I did, but I still thought it didn't work because of the bug I referred to earlier: the control's isEnabled attribute is currently ignored by the ARM checker (though it works fine with SVTs), so I thought the system was not picking up my control file. I then jumped onto my self-hosted build agent, started debugging and troubleshooting with Fiddler, and noticed that my file was indeed being retrieved. So I ended up cloning the AzSK repo, stepped through the ARM checker library, and realized that this property was simply ignored. I fixed that in my own local environment, and Microsoft should release a fix soon.

    Once you have uploaded your own version of ARMControls.json, you also have to specify how you want to use it in ServerConfigMetadata.json, either like this:

    {
        "Name": "ARMControls.json",
        "OverrideOffline": true
    }
    
    

    or like this:

     { "Name": "ARMControls.json" }

    With the first option, you tell AzSK to use only your file. The second option tells AzSK to overlay your file on top of the out-of-the-box ARMControls.json, which allows you to specify only the delta. It's up to you to decide which one you prefer.
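To make the difference concrete, here is a small Python sketch (purely illustrative, not AzSK code, and using a simplified control structure) of what the two modes conceptually do when combining your custom ARMControls.json with the built-in one:

```python
# Illustrative sketch of the two ServerConfigMetadata.json modes:
# "OverrideOffline": true  -> use only the custom file
# default                  -> overlay custom controls onto the built-in ones by id
def effective_controls(default_controls, custom_controls, override_offline=False):
    """Merge per-resource control lists keyed by control 'id'; custom wins on conflict."""
    if override_offline:
        return custom_controls
    merged = {c["id"]: c for c in default_controls}
    merged.update({c["id"]: c for c in custom_controls})
    return list(merged.values())

defaults = [{"id": "AppService270", "severity": "High"},
            {"id": "AppService280", "severity": "Medium"}]
custom = [{"id": "AppService270", "severity": "Low"}]  # delta only

print(effective_controls(defaults, custom, override_offline=True))  # only the custom file
print(len(effective_controls(defaults, custom)))                    # overlay keeps AppService280
```

The overlay mode keeps your custom file small, since you only carry the controls you actually changed.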

    Then, you may change existing controls and even add your own. If we come back to the control I showed earlier:

     
    {
        "id": "AppService270",
        "apiVersions": [ "2016-09-01" ],
        "controlId": "Azure_AppService_BCDR_Use_Multiple_Instances",
        "isEnabled": true,
        "description": "App Service must be deployed on a minimum of two instances to ensure availability",
        "rationale": "App Service deployed on multiple instances ...",
        "recommendation": "Run command 'Set-AzureRmAppServicePlan...",
        "severity": "High",
        "jsonPath": [ "$.sku.capacity" ],
        "matchType": "IntegerValue",
        "data": {
            "type": "GreaterThan",
            "value": 1
        }
    }

    we could enable or disable it (again, not working right now), decide that GreaterThan 0 would be more suitable for a development environment (where you don't especially want to pay for multiple instances), change the description, severity, etc. So, except for the jsonPath attribute, which you should never change, you can do whatever you want with the rest.
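If you're curious what the engine conceptually does with these declarative attributes, here is a simplified Python sketch (not AzSK's actual implementation; it supports only trivial dotted paths and the IntegerValue/GreaterThan match) of evaluating the control above against a resource from an ARM template:

```python
# Simplified sketch (not AzSK code) of evaluating a declarative control
# such as AppService270 against a resource from an ARM template.
def resolve_json_path(resource, json_path):
    """Resolve a trivial '$.a.b' path; real jsonPath is far richer."""
    node = resource
    for part in json_path.lstrip("$.").split("."):
        node = node.get(part) if isinstance(node, dict) else None
    return node

def evaluate_control(resource, control):
    value = resolve_json_path(resource, control["jsonPath"][0])
    if control["matchType"] == "IntegerValue" and control["data"]["type"] == "GreaterThan":
        return "Passed" if isinstance(value, int) and value > control["data"]["value"] else "Failed"
    return "Manual"  # unsupported match types would need human review

control = {"jsonPath": ["$.sku.capacity"], "matchType": "IntegerValue",
           "data": {"type": "GreaterThan", "value": 1}}
print(evaluate_control({"sku": {"capacity": 2}}, control))  # Passed
print(evaluate_control({"sku": {"capacity": 1}}, control))  # Failed
```

This is also why the jsonPath attribute must never be touched: it is the pointer into your template that the whole rule hangs on.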

    If you want to add your own controls, for instance to enforce MSI or to make sure ARR Affinity is disabled, you could simply add the following JSON body under the AppService resource:

    
    {
        "controlID": "Azure_AppService_ClientAffinity_SEY_0001",
        "description": "ARR Affinity must be turned off for App Service",
        "id": "AppService_Client_Affinity_SEY_0001",
        "severity": "Medium",
        "recommendation": "Go to Azure Portal --> your App Service --> Settings --> Application Settings --> ARR Affinity --> Click on 'OFF'.",
        "jsonPath": [ "$.properties.clientAffinityEnabled" ],
        "matchType": "Boolean",
        "data": {
            "value": true
        },
        "isEnabled": false,
        "rationale": "ARR Affinity makes it easier to drive DOS/DDOS attacks against the underlying resource"
    }
    
    

    This control is not enforced by AzSK right now, but I personally find that ARR Affinity may pose a security risk: when it is turned on, the web app returns a cookie named ARRAffinity containing the ID of the specific backend instance to which Azure should route the client's requests. That makes it easy for attackers to keep overloading the same instance until it is down, grab a fresh ARRAffinity cookie, and repeat until all the instances are down. In any case, ARR Affinity is often required by poorly written applications that were not designed to run on multiple nodes, so the earlier such issues are discovered (in the CI pipeline), the better!

    Step 3 - controls being checked by the SVT task
    As a reminder, the SVT task is meant to be used in a post-deployment phase, so it could, for instance, be the last task of a release pipeline. SVT works differently from the ARM Checker, although the ultimate purpose is the same. For SVT, all the control files live in [AzSK Module Path]\[AzSK version]\Framework\Configurations\SVT\Services, and each resource has its own corresponding JSON file. If we look into AppService.json, we'll find a control similar to the one we've worked with so far:

    
    {
        "ControlID": "Azure_AppService_BCDR_Use_Multiple_Instances",
        "Description": "App Service must be deployed on a minimum of two instances to ensure availability",
        "Id": "AppService270",
        "ControlSeverity": "Medium",
        "Automated": "Yes",
        "MethodName": "CheckAppServiceInstanceCount",
        "Rationale": "App Service deployed on ....",
        "Recommendation": "Run command ...",
        "Tags": [
            "SDL",
            "TCP",
            "Automated",
            "BCDR",
            "AppService"
        ],
        "Enabled": true,
        "FixControl": {
            "FixMethodName": "SetMultipleInstances",
            "FixControlImpact": "Medium"
        }
    }
    
    

    but as you might have noticed, there is no jsonPath attribute, nor any rule defining the behavior of the control. Instead, there is the MethodName attribute, which indicates which PowerShell method should be called when this control is executed. In this case, the method CheckAppServiceInstanceCount of the AppService resource should be called, but where is the code defined? Simple: in AppService.ps1 ([AzSK Module Path]\[AzSK version]\Framework\Core\SVT\Services), the script that defines the AppService class. Sure enough, if you look for CheckAppServiceInstanceCount in that file, you'll find its implementation.
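The pattern is essentially reflection-based dispatch: the declarative control names a method, and the engine invokes it on the resource class. A minimal Python sketch of the idea (class and method names mirror the control above, but this is not AzSK's code):

```python
# Sketch of SVT-style dispatch: each control declares a MethodName, and the
# engine invokes that method on the resource class via reflection.
class AppService:
    def __init__(self, instance_count):
        self.instance_count = instance_count

    def CheckAppServiceInstanceCount(self):
        # Toy stand-in for the real check implemented in AppService.ps1
        return "Passed" if self.instance_count >= 2 else "Failed"

def run_control(resource, control):
    method = getattr(resource, control["MethodName"], None)
    return method() if callable(method) else "Manual"

control = {"ControlID": "Azure_AppService_BCDR_Use_Multiple_Instances",
           "MethodName": "CheckAppServiceInstanceCount"}
print(run_control(AppService(instance_count=3), control))  # Passed
```

This is also why extending a resource (as shown next) works: a new control only needs to name a method that exists somewhere on the resource class or its extension.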

    That said, you should *never* change any of the local files. For SVT controls, you can change some settings in the ControlSettings.json file, but if you want to change the behavior of a given control, or add a new one, you'll have to extend the resource. For all the details, you should read the official doc, but I'm going to show you how I extended SVT to create the ARR Affinity check, in a similar way as for the ARM checker. Since ClientAffinityEnabled (displayed as ARR Affinity in the portal) is a property of an Azure web app, I extended the AppService feature by adding two new files to my org policy folder:

    AppService.ext.json that contains the declarative definition of the control
    AppService.ext.ps1 that contains the method to call when the control executes
    The contents of AppService.ext.json are:

    
    {
        "FeatureName": "AppService",
        "Reference": "aka.ms/azsktcp/appservice",
        "IsMaintenanceMode": false,
        "Controls": [
            {
                "ControlID": "Azure_AppService_ClientAffinity_SEY_0001",
                "Description": "ARR Affinity must be turned off for App Service",
                "Id": "AppService_Client_Affinity_SEY_0001",
                "ControlSeverity": "Medium",
                "Automated": "Yes",
                "MethodName": "CheckAppServiceClientAffinityEnabled",
                "Rationale": "ARR Affinity makes it easier to drive DOS/DDOS attacks against the underlying resource",
                "Recommendation": "Go to Azure Portal ...",
                "Tags": [
                    "SDL",
                    "TCP",
                    "Automated",
                    "Config",
                    "AppService",
                    "FunctionApp"
                ],
                "Enabled": true
            }
        ]
    }
    
    

    where the most important part is the method name CheckAppServiceClientAffinityEnabled, which is implemented in AppService.ext.ps1:

    #using namespace Microsoft.Azure.Commands.AppService.Models
    Set-StrictMode -Version Latest
    class AppServiceExt : AppService
    {
        hidden [PSObject] $ResourceObject;
        hidden [PSObject] $WebAppDetails;
        hidden [PSObject] $AuthenticationSettings;
        hidden [bool] $IsReaderRole;

        AppServiceExt([string] $subscriptionId, [string] $resourceGroupName, [string] $resourceName):
            Base($subscriptionId, $resourceGroupName, $resourceName)
        {
            $this.GetResourceObject();
            $this.AddResourceMetadata($this.ResourceObject.Properties)
        }

        AppServiceExt([string] $subscriptionId, [SVTResource] $svtResource):
            Base($subscriptionId, $svtResource)
        {
            $this.GetResourceObject();
            $this.AddResourceMetadata($this.ResourceObject.Properties)
        }

        hidden [ControlResult] CheckAppServiceClientAffinityEnabled([ControlResult] $controlResult)
        {
            if ([Helpers]::CheckMember($this.WebAppDetails, "ClientAffinityEnabled"))
            {
                if ($this.WebAppDetails.ClientAffinityEnabled)
                {
                    $controlResult.AddMessage([VerificationResult]::Failed,
                        [MessageData]::new("ARR Affinity for resource " + $this.ResourceContext.ResourceName + " is turned on", ($this.WebAppDetails | Select-Object ClientAffinityEnabled)));
                }
                else
                {
                    $controlResult.AddMessage([VerificationResult]::Passed,
                        [MessageData]::new("ARR Affinity for resource " + $this.ResourceContext.ResourceName + " is turned off", ($this.WebAppDetails | Select-Object ClientAffinityEnabled)));
                }
            }
            else
            {
                $controlResult.AddMessage([VerificationResult]::Manual,
                    [MessageData]::new("Could not validate ARR Affinity settings on the AppService: " + $this.ResourceContext.ResourceName + "."));
            }
            return $controlResult;
        }
    }

    in which I simply check the value of the ClientAffinityEnabled web app property and return the control result (Passed, Failed, or Manual) accordingly. Once you've uploaded your extension files to your org policy storage account, you also have to list them in ServerConfigMetadata.json.

    Step 4 - different policies for different environments?
    Depending on the control baseline you want to define (which I'm not covering in this article), you might decide to use a single corporate policy for all your subscriptions and environments, but you might also want to define different policies per environment. For instance, with the example we've discussed so far (multiple instances enforced by the control), you might want the control to behave differently in dev (where a single instance is enough) than in staging/prod, where you should indeed have more than one instance to be highly available.

    No matter what you decide, and independently of AzSK, the way your environments are organized (one vs. multiple subscriptions) will also influence the way you tackle your policies. AzSK supports having either one policy store per subscription or one global store. However, I noticed that this doesn't matter too much, especially when it comes to CI/CD, as the underlying tasks trigger a fresh PowerShell session for every execution and target a given policy store using the Set-AzSKPolicySettings cmdlet. So, if you decide to work with a single policy store across environments (DEV/STAGING/PROD), you might simply define your build/release variables this way:

    scoped at release level, and then let your different stages use the same settings. However, should you use different policy stores, you may define them like this:

    in this case, I defined AzSKServerUrl twice with different values, scoped respectively to the DEV and STAGING stages. My org policy storage account looks like this:

    So, I simply created one container per environment and each container contains its own control files:
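As a purely illustrative sketch (the storage account, container names, and URLs below are made up, not real AzSK endpoints), the per-environment policy selection boils down to a simple stage-to-store mapping:

```python
# Illustrative only: stage-to-policy-store mapping, mirroring AzSKServerUrl
# variables scoped per stage. All names and URLs here are assumptions.
POLICY_STORES = {
    "DEV":     "https://myorgpolicies.blob.core.windows.net/dev",
    "STAGING": "https://myorgpolicies.blob.core.windows.net/staging",
    "PROD":    "https://myorgpolicies.blob.core.windows.net/prod",
}

def policy_store_for(stage, single_store=None):
    """Return the policy store URL: one global store, or one container per stage."""
    return single_store if single_store else POLICY_STORES[stage]

print(policy_store_for("DEV"))
print(policy_store_for("PROD", single_store="https://myorgpolicies.blob.core.windows.net/global"))
```

Whether that mapping lives in release variables or a single global value, each task execution simply points its fresh PowerShell session at the resulting URL.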

    Hope this helps!

    Resolve to be a "learn-it-all" with the new Azure certifications!


    Happy New Year! Hopefully I'm not the first to wish you that, especially if you are reading this in July. A new year is associated with the chance to start fresh and change in some way for the better. Whether it's losing weight, reading more, learning a new language, or revitalizing your career, embarking on a self-improvement journey is always a great investment. In this post, we encourage you to pick a landmark occasion and invest in yourself by pursuing one of Microsoft's new Azure role-based certifications.

    At Microsoft, one of our key cultural principles is to embrace a growth mindset. Our CEO, Satya Nadella, stresses the importance of being a "learn-it-all" as opposed to a "know-it-all". In the technology field of the future, it is estimated that 65% of current students will work in jobs that don't exist today. Quickly and continually learning how to learn will be a key differentiator in career evolution and advancement.

    While learning is a reward in itself, taking things a step further and validating your learning with a certification can yield tangible benefits. Upon earning a certification, 23% of Microsoft certified technologists reported receiving salary increases, some of up to 20%. In addition, certified employees are often entrusted with supervising or mentoring their peers and/or positioned for promotion.

    Convinced? Great! With Microsoft's recently revamped Azure certifications focusing on specific roles, now is the time to flex those learning muscles. If you've caught the "learn-it-all" bug, by all means pursue any of the role-based certifications that interest you. However, if you work in application development, I recommend pursuing the Microsoft Certified Azure DevOps Engineer Expert or Microsoft Certified Azure Developer Associate certifications.

    The path for earning the Azure Developer Associate certification has recently changed, based on feedback and lessons learned from the AZ-200 and AZ-201 beta exams. The new exam, AZ-203: Developing Solutions for Microsoft Azure, will be the only exam necessary. Candidates for the Azure Developer Associate certification will be tested on developing Azure infrastructure as a service and platform as a service compute solutions, developing for cloud storage, implementing Azure security, connecting to and consuming Azure services and third-party services, and monitoring, troubleshooting, and optimizing Azure solutions.

    The Azure DevOps Engineer certification can be earned by passing a single exam, AZ-400: Microsoft Azure DevOps Solutions. AZ-400 is the first Microsoft exam to focus exclusively on DevOps. This exam measures the candidate's knowledge of DevOps practices, which combine people, processes, and technologies to continuously deliver valuable products and services that meet end-user needs and business objectives. This exam recently completed its beta phase and is being prepared for release. It should be generally available by the end of January.

    Once you've selected your certification path, how do you prepare? The following are a few of the many options and resources to aid your preparation:

    Whether it's New Year's Day, your birthday, or just a Monday, start fresh and become a "learn-it-all" by earning a Microsoft Azure role-based certification!

    Resolve to be a “learn it all” with new Azure Certifications!


    Contributors: Mike Lapierre, David Lipien, Leonel Mora, Doug Owens


    Happy New Year! Hopefully I'm not the first to wish you that, especially if you are reading this in July. A new year is associated with the chance to start fresh and resolve to change in some way for the better. Whether it’s losing weight, reading more, learning a new language, or revitalizing your career, embarking on a self-improvement journey is always a great investment. In this post, we are going to encourage you to pick a landmark occasion and invest in yourself by pursuing one of Microsoft's new Azure role-based certifications.

    At Microsoft, one of our key cultural principles is to embrace a growth mindset. Our CEO, Satya Nadella, encourages the importance of being a "learn-it-all" as opposed to being a "know-it-all". In the technology field of the future, it is estimated that 65% of current students will work in jobs that don't exist today. Quickly and continually learning how to learn will be a key differentiator in career evolution and advancement.

    While learning is a reward unto itself, taking things a step further and validating your learning with a certification can yield tangible benefits. Upon earning a certification, 23% of Microsoft certified technologists report receiving salary increases, some of up to 20%. In addition, certified employees are often entrusted with supervising or mentoring their peers, included in strategic discussions, and/or promoted.

    Convinced? Great! With Microsoft's recent revamping of its Azure certifications to focus on specific roles, now is the time to flex those learning muscles. If you've caught the "learn-it-all" bug, by all means pursue any of the role-based certifications that interest you. However, if you work in application development, I recommend pursuing the Microsoft Certified Azure Developer Associate or Microsoft Certified Azure DevOps Engineer Expert certifications.

    The path for earning the Azure Developer Associate certification has recently changed, based on AZ-200 and AZ-201 beta exam feedback and lessons learned. The new exam, AZ-203: Developing Solutions for Microsoft Azure, will be the only exam necessary. Candidates for the Azure Developer Associate certification will be tested on developing Azure infrastructure as a service and platform as a service compute solutions, developing for Azure storage, implementing Azure security, connecting to and consuming Azure services and third-party services, and monitoring, troubleshooting, and optimizing Azure solutions.

    The Azure DevOps Engineer certification can be earned by passing a single exam, AZ-400: Microsoft Azure DevOps Solutions. AZ-400 is the first Microsoft exam to focus exclusively on DevOps. This exam measures the candidate’s knowledge of DevOps practices which combine people, process, and technologies to continuously deliver valuable products and services that meet end user needs and business objectives. This exam recently completed its beta phase and is being prepared for release. It should be generally available by the end of January.

    Once you’ve selected your certification path, how do you get ready? The following are a few of many options and resources to aid your preparation:

    Whether it's New Year’s Day, your birthday, or just a Monday, start fresh and become a "learn-it-all" by earning a Microsoft Azure role-based certification!

    Become a "learn-it-all" with the new Azure certifications!


    Happy New Year! I hope I'm not the first to wish you that, especially if you are reading this article in July. A new year is an opportunity to start fresh and introduce positive changes. Whether it's losing weight, reading more often, learning a new language, or revitalizing your career, self-improvement is always a good investment. In this post, we encourage you to seize one of these occasions to invest in yourself by pursuing one of Microsoft's new Azure role-based certifications.

    At Microsoft, one of our key cultural principles is to foster a growth mindset. Our CEO, Satya Nadella, emphasizes the importance of being a "learn-it-all" rather than a "know-it-all". In the technology field of the future, it is estimated that 65% of current students will work in jobs that don't yet exist today. The ability to learn quickly and continually will be a differentiator in career evolution and advancement.

    Although learning is a reward in itself, going a step further and validating your learning with a certification can bring tangible benefits. After earning a certification, 23% of Microsoft certified technologists reported a salary increase, some of up to 20%. In addition, certified employees are often entrusted with supervising or mentoring their peers, included in strategic discussions, and/or promoted.

    Convinced? Excellent! With Microsoft having just renewed its Azure certifications to focus on specific roles, now is the time to exercise those learning muscles. If you've caught the "learn-it-all" fever, by all means pursue the role-based certification that excites you the most. That said, if you work in application development, I recommend the Microsoft Certified Azure Developer Associate or Microsoft Certified Azure DevOps Engineer Expert certification.

    The path for earning the Azure Developer Associate certification has recently changed, based on feedback and lessons learned from the AZ-200 and AZ-201 beta exams. The new exam, AZ-203: Developing Solutions for Microsoft Azure, will be the only exam necessary. Candidates for the Azure Developer Associate certification will be evaluated on developing solutions based on Azure infrastructure and platform as a service compute services, developing for Azure storage, implementing Azure security, connecting to and consuming Azure and third-party services, and monitoring, troubleshooting, and optimizing Azure solutions.

    The Azure DevOps Engineer certification can be earned by passing a single exam, AZ-400: Microsoft Azure DevOps Solutions. AZ-400 is the first Microsoft exam to focus solely on DevOps. This exam measures the candidate's knowledge of DevOps practices, which combine people, processes, and technologies to continuously deliver valuable products and services that meet end-user needs and business objectives. The exam recently completed its beta phase and is being prepared for release. The final version should be available by the end of January.

    Once you've chosen your certification path, how do you prepare? Here are a few of the many options and resources to help with your preparation:

    Whether it's a new year, your birthday, or simply a Monday, start fresh and become a "learn-it-all" by earning a Microsoft Azure role-based certification!

    Moving the cheese – StevenLasker.blog


    Microsoft is retiring personal MSDN blogs. As a result, I've moved all my blog content to https://StevenLasker.blog

    I hope you'll keep the great feedback coming there.

    Steve

     

    2018 year-end link clearance



    Viewing and Sorting XEvents Efficiently (Code Samples) – XEProfiler


    I was doing backups and clean-ups and ran across a couple of sample projects for XEvent and event_sequence processing that I thought others might find helpful. Enjoy!

    The sample code is provided "as is" and any express or implied warranties, including the implied warranties of merchantability and fitness for a particular purpose, are disclaimed.


    In my previous blog posts, I highlighted the importance of using Event Sequence as the sort key.

    How It Works: XEvent Output and Visualization
    SQL Server Management Studio Provides–“XE Profiler”
    Use the SSMS XEvent Profiler

    Some of the feedback has been that sorting can be a cumbersome operation because:

    • Using sys.fn_xe_file_target_read_file, the XML has to be queried and indexed, which can take time and space, followed by an ORDER BY event_sequence query
    • Using SQL Server Management Studio (SSMS) and storing the events in a table requires reading the events and streaming them to table storage, followed by an ORDER BY event_sequence query
    • Large traces can hit the 32-bit SSMS memory limitations

    Several years ago, I created XEProfiler, based loosely on the SSMS implementation but built as a 64-bit application to avoid the 32-bit memory limitations, and using parallel activities to speed up the processing.


    The XEProfiler Sample

    • Creates an empty DataGrid control
    • Makes a pass over the files to obtain the Event Locators
    • Sorts the EventLocators
    • Leverages the display location to paint the information on the screen

    Note: By tracking and using EventLocators, the overall memory is reduced, and the 64-bit address space allows for a larger set of trace data.
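The overall flow can be sketched in a few lines of Python (with simulated in-memory events; the real implementation uses QueryableXEventData and EventLocator):

```python
# Language-agnostic sketch of the XEProfiler approach: keep only small
# (sequence, locator) records in memory, sort them once, and fetch the full
# event lazily when a row is painted. Event payloads here are simulated.
files = {
    "trace_0.xel": [(4, "evt-d"), (1, "evt-a")],   # (event_sequence, payload locator)
    "trace_1.xel": [(3, "evt-c"), (2, "evt-b")],
}

# Pass 1: collect lightweight locators from every file.
locators = [(seq, fname, loc)
            for fname, events in files.items()
            for seq, loc in events]

# Sequence may have holes (event loss), so sort only after reading everything.
locators.sort(key=lambda rec: rec[0])

def fetch_event(fname, loc):
    """Stand-in for EventProvider.RetrieveEvent(locator) in the C# sample."""
    return next(p for s, p in files[fname] if p == loc)

# Pass 2: resolve payloads on demand, in sequence order.
ordered = [fetch_event(fname, loc) for _, fname, loc in locators]
print(ordered)  # ['evt-a', 'evt-b', 'evt-c', 'evt-d']
```

Because only the small locator tuples are kept resident, the working set stays proportional to the event count rather than the event payload size, which is what lets the 64-bit tool handle large traces.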


    TSQL Sample Session

    CREATE EVENT SESSION [EventSequence] ON SERVER
    ADD EVENT sqlserver.sql_batch_completed
    (
        ACTION(package0.event_sequence, sqlserver.client_app_name, sqlserver.session_id)
    ),
    ADD EVENT sqlserver.sql_batch_starting
    (
        ACTION(package0.event_sequence, sqlserver.client_app_name, sqlserver.session_id)
    )
    ADD TARGET package0.event_file(SET filename=N'c:\temp\EventSequence', max_file_size=(1024), max_rollover_files=(100))
    WITH
    (
        MAX_MEMORY=1024 MB,
        EVENT_RETENTION_MODE=ALLOW_SINGLE_EVENT_LOSS,
        MAX_DISPATCH_LATENCY=30 SECONDS,
        MAX_EVENT_SIZE=0 KB,
        MEMORY_PARTITION_MODE=PER_CPU,
        TRACK_CAUSALITY=ON,
        STARTUP_STATE=OFF
    )

    Form1.cs

    using System;
    using System.Collections.Generic;
    using System.Drawing;
    using System.IO;
    using System.Linq;
    using System.Threading;
    using System.Threading.Tasks;
    using System.Windows.Forms;
    using Microsoft.SqlServer.XEvent.Linq; 

    namespace XEProfiler
    {
        public partial class xeProfilerForm : Form
        {
            // Event locator index needs
            //
            private class LocationInfo
            {
                public QueryableXEventData Stream
                {
                    get;
                    private set;
                } 

                public EventLocator Location
                {
                    get;
                    private set;
                } 

                public ulong Sequence
                {
                    get;
                    private set;
                } 

                // File can consume memory - for sample purposes only
                //
                public string File
                {
                    get;
                    private set;
                } 

                public LocationInfo(string file, ulong sequence, EventLocator location, QueryableXEventData stream)
                {
                    File = file;
                    Sequence = sequence;
                    Location = location;
                    Stream = stream;
                }
            } 

            private SolidBrush textBrush;
            private Font textFont;
            private StringFormat stringFormat = new StringFormat();
            private LocationInfo[] sortedLocators = null; 

            // Constructor to make display loop a bit like SQL Profiler
            //
            public xeProfilerForm()
            {
                InitializeComponent();
                stringFormat.Alignment = StringAlignment.Near;
                stringFormat.LineAlignment = StringAlignment.Near;
                stringFormat.FormatFlags = StringFormatFlags.NoWrap; 

                textBrush = new SolidBrush(Color.Black);
                textFont = new System.Drawing.Font("Courier New", 8.25F, System.Drawing.FontStyle.Regular, System.Drawing.GraphicsUnit.Point, ((byte)(0))); 

                dataGrid.RowHeadersWidth = 10;
                dataGrid.AllowUserToAddRows = false;
                dataGrid.CellPainting += DataGrid_CellPainting;
            } 

            // TODO: Make this meta data driven with targets for fields vs actions to remove need for switch statement
            //
            string[] ColIndexNames = { "timestamp", "event_sequence", "name", "batch_text", "event_file" }; 

            // Draw the proper text for the current display region
            //
            private void DataGrid_CellPainting(object sender, DataGridViewCellPaintingEventArgs e)
            {
                if (e.ColumnIndex > -1 && e.RowIndex > -1)
                {
                    e.PaintBackground(e.ClipBounds, false);
                    LocationInfo info = sortedLocators[e.RowIndex];
                    PublishedEvent pubEvent = info.Stream.EventProvider.RetrieveEvent(info.Location);
                    PublishedAction action;
                    PublishedEventField field;

                    string strText = string.Empty; 

                    switch (e.ColumnIndex)
                    {
                        case 0:
                            strText = pubEvent.Timestamp.UtcDateTime.ToString();
                            break; 

                        case 1:
                            if (pubEvent.Actions.TryGetValue(ColIndexNames[e.ColumnIndex], out action))
                            {
                                strText = action.Value.ToString();
                            }
                            break; 

                        case 2:
                            strText = pubEvent.Name;
                            break; 

                        case 3:
                            if (pubEvent.Fields.TryGetValue(ColIndexNames[e.ColumnIndex], out field))
                            {
                                strText = field.Value.ToString();
                            }
                            break; 

                        case 4:
                            strText = info.File;
                            break;
                    }              

                    e.Graphics.DrawString(strText, textFont, textBrush, e.CellBounds, stringFormat);
                    e.Handled = true;
                }
            } 

            // Load the event locators and activate the display
            //
            private void Form1_Load(object sender, EventArgs e)
            {

                dataGrid.Enabled = false;
                string[] files = Directory.GetFiles(@"c:\temp", @"EventSequence*", SearchOption.TopDirectoryOnly); 

                object locationLock = new object();
                int rows = 0;
                List<LocationInfo> eventLocators = new List<LocationInfo>(); 

                // Load the EventLocators in parallel
                //
                Parallel.ForEach (files, file  =>
                {
                    LoadEvents(ref rows, ref locationLock, ref eventLocators, file);
                }); 

                // Get array sorted by sequence values, allowing parallel Linq activities
                //
                sortedLocators = (from locator in eventLocators select locator)
                                     .OrderBy(x => x.Sequence)
                                     .ToArray(); 

                SetRowCount(rows, true);
            } 

            // Load the events into the list to be sorted and used as index for display
            //
            private void LoadEvents(ref int rows, ref object locationLock, ref List<LocationInfo> eventLocators, string file)
            {
                string fileName = Path.GetFileNameWithoutExtension(file);

                // Open and track the stream so we can use later to access event data
                //
                QueryableXEventData stream = new QueryableXEventData(file); 

                foreach (PublishedEvent pubEvent in stream)
                {
                    // Sequence may have holes due to options such as Event Loss
                    // All events must be read before final sort can be closed.
                    //
                    ulong sequence = (ulong)pubEvent.Actions["event_sequence"].Value;
                    EventLocator locator = pubEvent.Location;
                    LocationInfo info = new LocationInfo(fileName, sequence, locator, stream); 

                    lock (locationLock)
                    {
                        eventLocators.Add(info);
                    } 

                    // Simple progress
                    //
                    int rowCount = Interlocked.Increment(ref rows);
                    if (rowCount % 128 == 0)
                    {
                        SetRowCount(rowCount, false);
                    }
                }
            } 

            // Once row count is complete display headers (TODO: based on metadata)
            // and update the rowcount in status bar
            //
            private void SetRowCount(int rows, bool finished)
            {
                toolStripRowCount.Text = String.Format("{0}", rows);
                toolStripRowCount.Invalidate(); 

                if (finished)
                {
                    textBoxDisplay.Text = String.Empty;
                    dataGrid.Columns.Clear(); 

                    DataGridViewColumn newColumn = new DataGridViewTextBoxColumn();
                    newColumn.Name = "timestamp";
                    newColumn.HeaderText = "timestamp";
                    newColumn.ToolTipText = "Timestamp of the captured event";
                    dataGrid.Columns.Add(newColumn); 

                    newColumn = new DataGridViewTextBoxColumn();
                    newColumn.Name = "sequence";
                    newColumn.HeaderText = "sequence";
                    newColumn.ToolTipText = "Event sequence of the captured event";
                    dataGrid.Columns.Add(newColumn); 

                    newColumn = new DataGridViewTextBoxColumn();
                    newColumn.Name = "name";
                    newColumn.HeaderText = "name";
                    newColumn.ToolTipText = "Name of the captured event";
                    dataGrid.Columns.Add(newColumn); 

                    newColumn = new DataGridViewTextBoxColumn();
                    newColumn.Name = "batch_text";
                    newColumn.HeaderText = "batch_text";
                    newColumn.ToolTipText = "Batch text";
                    newColumn.Width = 200;
                    dataGrid.Columns.Add(newColumn); 

                    newColumn = new DataGridViewTextBoxColumn();
                    newColumn.Name = "file";
                    newColumn.HeaderText = "file";
                    newColumn.ToolTipText = "Event source file";
                    newColumn.Width = 200;
                    dataGrid.Columns.Add(newColumn); 

                    //  Change the dataGrid to an OwnerDraw (OnPaint) so we only have to
                    //  draw the data and not really load it into the data grid rows
                    //
                    dataGrid.RowCount = rows;
                    if (rows > 0)
                    {
                        dataGrid.Rows[0].Selected = true;   // Select the first row (index 0); Rows[1] throws when only one row exists
                    } 

                    dataGrid.Enabled = true;
                    dataGrid.Invalidate();
                    dataGrid.Refresh();
                }
            } 

            // Batch text display
            //
            private void dataGrid_SelectionChanged(object sender, EventArgs e)
            {
                textBoxDisplay.Text = String.Empty; 

                if (0 != dataGrid.SelectedRows.Count)
                {
                    List<DataGridViewRow> orderedSelection = (from DataGridViewRow row in dataGrid.SelectedRows select row)
                                                              .OrderBy(x=>x.Index)
                                                              .ToList(); 

                    foreach (DataGridViewRow selectedRow in orderedSelection)
                    {
                        if (textBoxDisplay.Text != String.Empty)
                        {
                            textBoxDisplay.Text += "\r\ngo\r\n";
                        } 

                        LocationInfo info = sortedLocators[selectedRow.Index];
                        PublishedEvent pubEvent = info.Stream.EventProvider.RetrieveEvent(info.Location);
                        PublishedEventField field;
                        string strText = string.Empty; 

                        if (pubEvent.Fields.TryGetValue("batch_text", out field))
                        {
                            strText = field.Value.ToString();
                        } 

                        textBoxDisplay.Text += strText;
                    }
                }
            }
        }
    }
     

    Form1.Designer.cs

    namespace XEProfiler
    {
        partial class xeProfilerForm
        {
            /// <summary>
            /// Required designer variable.
            /// </summary>
            private System.ComponentModel.IContainer components = null;

            /// <summary>
            /// Clean up any resources being used.
            /// </summary>
            /// <param name="disposing">true if managed resources should be disposed; otherwise, false.</param>
            protected override void Dispose(bool disposing)
            {
                if (disposing && (components != null))
                {
                    components.Dispose();
                }
                base.Dispose(disposing);
            }

            #region Windows Form Designer generated code

            /// <summary>
            /// Required method for Designer support - do not modify
            /// the contents of this method with the code editor.
            /// </summary>
            private void InitializeComponent()
            {
                System.Windows.Forms.DataGridViewCellStyle dataGridViewCellStyle1 = new System.Windows.Forms.DataGridViewCellStyle();
                System.Windows.Forms.DataGridViewCellStyle dataGridViewCellStyle2 = new System.Windows.Forms.DataGridViewCellStyle();
                System.ComponentModel.ComponentResourceManager resources = new System.ComponentModel.ComponentResourceManager(typeof(xeProfilerForm));
                this.dataGrid = new System.Windows.Forms.DataGridView();
                this.statusBar = new System.Windows.Forms.StatusStrip();
                this.labelRowCount = new System.Windows.Forms.ToolStripStatusLabel();
                this.toolStripRowCount = new System.Windows.Forms.ToolStripStatusLabel();
                this.menuStrip1 = new System.Windows.Forms.MenuStrip();
                this.fileToolStripMenuItem = new System.Windows.Forms.ToolStripMenuItem();
                this.editToolStripMenuItem = new System.Windows.Forms.ToolStripMenuItem();
                this.viewToolStripMenuItem = new System.Windows.Forms.ToolStripMenuItem();
                this.toolsToolStripMenuItem = new System.Windows.Forms.ToolStripMenuItem();
                this.windowToolStripMenuItem = new System.Windows.Forms.ToolStripMenuItem();
                this.helpToolStripMenuItem = new System.Windows.Forms.ToolStripMenuItem();
                this.textBoxDisplay = new System.Windows.Forms.TextBox();
                this.splitContainer1 = new System.Windows.Forms.SplitContainer();
                ((System.ComponentModel.ISupportInitialize)(this.dataGrid)).BeginInit();
                this.statusBar.SuspendLayout();
                this.menuStrip1.SuspendLayout();
                ((System.ComponentModel.ISupportInitialize)(this.splitContainer1)).BeginInit();
                this.splitContainer1.Panel1.SuspendLayout();
                this.splitContainer1.Panel2.SuspendLayout();
                this.splitContainer1.SuspendLayout();
                this.SuspendLayout();
                //
                // dataGrid
                //
                this.dataGrid.AllowUserToAddRows = false;
                this.dataGrid.AllowUserToDeleteRows = false;
                this.dataGrid.AllowUserToOrderColumns = true;
                this.dataGrid.ColumnHeadersHeightSizeMode = System.Windows.Forms.DataGridViewColumnHeadersHeightSizeMode.AutoSize;
                dataGridViewCellStyle1.Alignment = System.Windows.Forms.DataGridViewContentAlignment.MiddleLeft;
                dataGridViewCellStyle1.BackColor = System.Drawing.SystemColors.Window;
                dataGridViewCellStyle1.Font = new System.Drawing.Font("Courier New", 8.25F, System.Drawing.FontStyle.Regular, System.Drawing.GraphicsUnit.Point, ((byte)(0)));
                dataGridViewCellStyle1.ForeColor = System.Drawing.SystemColors.ControlText;
                dataGridViewCellStyle1.SelectionBackColor = System.Drawing.SystemColors.Highlight;
                dataGridViewCellStyle1.SelectionForeColor = System.Drawing.SystemColors.HighlightText;
                dataGridViewCellStyle1.WrapMode = System.Windows.Forms.DataGridViewTriState.False;
                this.dataGrid.DefaultCellStyle = dataGridViewCellStyle1;
                this.dataGrid.Dock = System.Windows.Forms.DockStyle.Fill;
                this.dataGrid.Location = new System.Drawing.Point(0, 0);
                this.dataGrid.Name = "dataGrid";
                this.dataGrid.ReadOnly = true;
                dataGridViewCellStyle2.Alignment = System.Windows.Forms.DataGridViewContentAlignment.MiddleLeft;
                dataGridViewCellStyle2.BackColor = System.Drawing.SystemColors.Control;
                dataGridViewCellStyle2.Font = new System.Drawing.Font("Courier New", 8.25F, System.Drawing.FontStyle.Regular, System.Drawing.GraphicsUnit.Point, ((byte)(0)));
                dataGridViewCellStyle2.ForeColor = System.Drawing.SystemColors.WindowText;
                dataGridViewCellStyle2.SelectionBackColor = System.Drawing.SystemColors.Highlight;
                dataGridViewCellStyle2.SelectionForeColor = System.Drawing.SystemColors.HighlightText;
                dataGridViewCellStyle2.WrapMode = System.Windows.Forms.DataGridViewTriState.True;
                this.dataGrid.RowHeadersDefaultCellStyle = dataGridViewCellStyle2;
                this.dataGrid.RowTemplate.Height = 18;
                this.dataGrid.Size = new System.Drawing.Size(1089, 277);
                this.dataGrid.TabIndex = 0;
                this.dataGrid.SelectionChanged += new System.EventHandler(this.dataGrid_SelectionChanged);
                //
                // statusBar
                //
                this.statusBar.Items.AddRange(new System.Windows.Forms.ToolStripItem[] {
                this.labelRowCount,
                this.toolStripRowCount});
                this.statusBar.Location = new System.Drawing.Point(0, 479);
                this.statusBar.Name = "statusBar";
                this.statusBar.Size = new System.Drawing.Size(1089, 22);
                this.statusBar.TabIndex = 1;
                this.statusBar.Text = "statusBar";
                //
                // labelRowCount
                //
                this.labelRowCount.Name = "labelRowCount";
                this.labelRowCount.Size = new System.Drawing.Size(41, 17);
                this.labelRowCount.Text = "Rows: ";
                //
                // toolStripRowCount
                //
                this.toolStripRowCount.Name = "toolStripRowCount";
                this.toolStripRowCount.Size = new System.Drawing.Size(108, 17);
                this.toolStripRowCount.Text = "toolStripRowCount";
                //
                // menuStrip1
                //
                this.menuStrip1.Items.AddRange(new System.Windows.Forms.ToolStripItem[] {
                this.fileToolStripMenuItem,
                this.editToolStripMenuItem,
                this.viewToolStripMenuItem,
                this.toolsToolStripMenuItem,
                this.windowToolStripMenuItem,
                this.helpToolStripMenuItem});
                this.menuStrip1.Location = new System.Drawing.Point(0, 0);
                this.menuStrip1.Name = "menuStrip1";
                this.menuStrip1.Size = new System.Drawing.Size(1089, 24);
                this.menuStrip1.TabIndex = 2;
                this.menuStrip1.Text = "menuStrip1";
                //
                // fileToolStripMenuItem
                //
                this.fileToolStripMenuItem.Name = "fileToolStripMenuItem";
                this.fileToolStripMenuItem.Size = new System.Drawing.Size(37, 20);
                this.fileToolStripMenuItem.Text = "File";
                //
                // editToolStripMenuItem
                //
                this.editToolStripMenuItem.Name = "editToolStripMenuItem";
                this.editToolStripMenuItem.Size = new System.Drawing.Size(39, 20);
                this.editToolStripMenuItem.Text = "Edit";
                //
                // viewToolStripMenuItem
                //
                this.viewToolStripMenuItem.Name = "viewToolStripMenuItem";
                this.viewToolStripMenuItem.Size = new System.Drawing.Size(44, 20);
                this.viewToolStripMenuItem.Text = "View";
                //
                // toolsToolStripMenuItem
                //
                this.toolsToolStripMenuItem.Name = "toolsToolStripMenuItem";
                this.toolsToolStripMenuItem.Size = new System.Drawing.Size(47, 20);
                this.toolsToolStripMenuItem.Text = "Tools";
                //
                // windowToolStripMenuItem
                //
                this.windowToolStripMenuItem.Name = "windowToolStripMenuItem";
                this.windowToolStripMenuItem.Size = new System.Drawing.Size(63, 20);
                this.windowToolStripMenuItem.Text = "Window";
                //
                // helpToolStripMenuItem
                //
                this.helpToolStripMenuItem.Name = "helpToolStripMenuItem";
                this.helpToolStripMenuItem.Size = new System.Drawing.Size(44, 20);
                this.helpToolStripMenuItem.Text = "Help";
                //
                // textBoxDisplay
                //
                this.textBoxDisplay.Dock = System.Windows.Forms.DockStyle.Fill;
                this.textBoxDisplay.HideSelection = false;
                this.textBoxDisplay.Location = new System.Drawing.Point(0, 0);
                this.textBoxDisplay.Multiline = true;
                this.textBoxDisplay.Name = "textBoxDisplay";
                this.textBoxDisplay.ReadOnly = true;
                this.textBoxDisplay.ScrollBars = System.Windows.Forms.ScrollBars.Both;
                this.textBoxDisplay.Size = new System.Drawing.Size(1089, 174);
                this.textBoxDisplay.TabIndex = 3;
                //
                // splitContainer1
                //
                this.splitContainer1.Dock = System.Windows.Forms.DockStyle.Fill;
                this.splitContainer1.Location = new System.Drawing.Point(0, 24);
                this.splitContainer1.Name = "splitContainer1";
                this.splitContainer1.Orientation = System.Windows.Forms.Orientation.Horizontal;
                //
                // splitContainer1.Panel1
                //
                this.splitContainer1.Panel1.Controls.Add(this.dataGrid);
                this.splitContainer1.Panel1MinSize = 150;
                //
                // splitContainer1.Panel2
                //
                this.splitContainer1.Panel2.Controls.Add(this.textBoxDisplay);
                this.splitContainer1.Panel2MinSize = 150;
                this.splitContainer1.Size = new System.Drawing.Size(1089, 455);
                this.splitContainer1.SplitterDistance = 277;
                this.splitContainer1.TabIndex = 4;
                //
                // xeProfilerForm
                //
                this.AutoScaleDimensions = new System.Drawing.SizeF(6F, 13F);
                this.AutoScaleMode = System.Windows.Forms.AutoScaleMode.Font;
                this.ClientSize = new System.Drawing.Size(1089, 501);
                this.Controls.Add(this.splitContainer1);
                this.Controls.Add(this.statusBar);
                this.Controls.Add(this.menuStrip1);
                this.Icon = ((System.Drawing.Icon)(resources.GetObject("$this.Icon")));
                this.MainMenuStrip = this.menuStrip1;
                this.Name = "xeProfilerForm";
                this.Text = "XEProfiler";
                this.WindowState = System.Windows.Forms.FormWindowState.Maximized;
                this.Load += new System.EventHandler(this.Form1_Load);
                ((System.ComponentModel.ISupportInitialize)(this.dataGrid)).EndInit();
                this.statusBar.ResumeLayout(false);
                this.statusBar.PerformLayout();
                this.menuStrip1.ResumeLayout(false);
                this.menuStrip1.PerformLayout();
                this.splitContainer1.Panel1.ResumeLayout(false);
                this.splitContainer1.Panel2.ResumeLayout(false);
                this.splitContainer1.Panel2.PerformLayout();
                ((System.ComponentModel.ISupportInitialize)(this.splitContainer1)).EndInit();
                this.splitContainer1.ResumeLayout(false);
                this.ResumeLayout(false);
                this.PerformLayout();

            }

            #endregion

            private System.Windows.Forms.DataGridView dataGrid;
            private System.Windows.Forms.StatusStrip statusBar;
            private System.Windows.Forms.ToolStripStatusLabel labelRowCount;
            private System.Windows.Forms.ToolStripStatusLabel toolStripRowCount;
            private System.Windows.Forms.MenuStrip menuStrip1;
            private System.Windows.Forms.ToolStripMenuItem fileToolStripMenuItem;
            private System.Windows.Forms.ToolStripMenuItem editToolStripMenuItem;
            private System.Windows.Forms.ToolStripMenuItem viewToolStripMenuItem;
            private System.Windows.Forms.ToolStripMenuItem toolsToolStripMenuItem;
            private System.Windows.Forms.ToolStripMenuItem windowToolStripMenuItem;
            private System.Windows.Forms.ToolStripMenuItem helpToolStripMenuItem;
            private System.Windows.Forms.TextBox textBoxDisplay;
            private System.Windows.Forms.SplitContainer splitContainer1;
        }
    }

    MergeXEvents

    MergeXEvents consumes XEvent files, using optimized parallel sort algorithms to output events to the physical file in event_sequence order.  This is a project I created in 2014 to help Microsoft SQL Server support process incoming XEvent files, including the ability to merge XEvent files captured from multiple servers, such as replication or high-availability XEvent captures.  The project uses parallel threads to perform scatter/gather activities, as well as a tournament sort to optimize the sorting and merging activities.
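The core idea can be shown without any of the project's types: each partition is already sorted, and a single reader repeatedly takes the smallest head element across partitions. This is a minimal, self-contained sketch (not the project's TSortMatchingTree) that uses a linear scan to pick the winner; the real code replaces that scan with a tournament tree so each pick costs O(log partitions) comparisons instead of O(partitions).

```csharp
using System;
using System.Collections.Generic;

// Minimal illustration of the k-way merge behind MergeXEvents: each
// partition holds event_sequence values already sorted ascending, and a
// single reader repeatedly emits the smallest head element remaining.
static class KWayMergeSketch
{
    public static List<ulong> Merge(List<ulong>[] partitions)
    {
        var positions = new int[partitions.Length];   // per-partition read cursor
        var merged = new List<ulong>();

        while (true)
        {
            int winner = -1;
            for (int i = 0; i < partitions.Length; i++)
            {
                if (positions[i] >= partitions[i].Count) continue;    // partition drained
                if (winner == -1 ||
                    partitions[i][positions[i]] < partitions[winner][positions[winner]])
                {
                    winner = i;
                }
            }

            if (winner == -1) break;                  // all partitions drained
            merged.Add(partitions[winner][positions[winner]++]);
        }

        return merged;
    }

    public static void Main()
    {
        // Simulate per-CPU partitions of event_sequence values.
        var partitions = new[]
        {
            new List<ulong> { 1, 4, 9 },
            new List<ulong> { 2, 3, 10 },
            new List<ulong> { 5, 6, 7, 8 },
        };

        Console.WriteLine(string.Join(",", Merge(partitions)));  // prints 1,2,3,4,5,6,7,8,9,10
    }
}
```

The tournament tree in TSortMatchingTree earns its keep as the partition count grows: with 32 partitions, re-seating one winner touches only 5 matches rather than rescanning all heads.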

    Program.cs

    /*
    *      Designed to merge XEvents from multiple files into a single stream sorted by Event Sequence, Time and Server name
    *     
    *      This allows capture of large XEvent traces, using per-CPU partitioning, and then post-sorting in sequence order so the
    *      stream is in physical sequence order.
    *
    *      Arrays are owned by # of partitions and dynamically expanded as events are read and added to them
    *      Once all data is read each array is sorted in ascending or descending order (by a group of parallel workers)
    *      Once sorted the tournament sort is used by a single reader to gather the array streams and write to the output file
    *     
    *      Output File[n..] <----- Parallel Task [Ordered Output Array Segment 1] |                                                           | <----- Array[1]       
    *                                                                             | <---- FWriteEntry <-- Gather Streams  <-- Parallel sorts   | <----- Array[2] <----- Distribute streams | Reader
    *      Output File[n..] <----- Parallel Task [Ordered Output Array Segment N] |                                                           | <----- Array[3]
    *
    *      NOTE:  Testing shows parallel output files do not have performance gains.  The original EventProvider, which is not thread safe,
    *              prevents you from opening and using a second copy of the input stream and a second EventProvider, and adding lock() activities between
    *              RetrieveObject(EventLocator) and the Serialize causes more context switching than just doing the work on a single thread.
    * */

    #define SEPERATE_OUTPUT

    using System;
    using System.Reflection;
    using System.Collections;
    using System.Collections.Generic;
    using System.Text;
    using System.Threading.Tasks;
    using Microsoft.SqlServer.XEvent.Linq;
    using System.IO;
    using VersionInfo;
    using System.Security.AccessControl;
    using System.Runtime.InteropServices;
    using ScatterGatherParallelNS;
    using System.Threading;

    namespace SortManager
    {
        abstract class ISortLogOutput
        {
            abstract public void LogOutputMessage(string strMsg);
        }

    #region DynamicList
        public sealed class TDynamicList<_T> : IDisposable
        {
            internal TDynamicList()
            {
                m_Items = new List<_T>();
            }

            public void Dispose()
            {
                if (null != m_Items)
                {
                    m_Items.Clear();
                }

                m_Items = null;
            }

            internal void Add(_T newEntry)
            {
                lock (m_Items)
                {
                    m_Items.Add(newEntry);
                }
            }

            internal bool FMoveNext()
            {
                return m_Walker.MoveNext();
            }

            internal _T GetCurrentEntry
            {
                get
                {
                    System.Diagnostics.Debug.Assert(null != m_Walker.Current);
                    return m_Walker.Current;
                }
            }

            internal bool FHasData
            {
                get
                {
                    return null != m_Walker.Current;
                }
            }

            internal Int32 StoredItems
            {
                get { return m_Items.Count; }
            }

            internal void Sort(IComparer<_T> comparer)
            {
                if (null != m_Items && m_Items.Count > 0)
                {
                    m_Items.Sort(comparer);
                }

                m_Walker = m_Items.GetEnumerator();
            }

            //      Members
            List<_T>                  m_Items;
            List<_T>.Enumerator       m_Walker;
        }
    #endregion

    #region DynamicArray
        public sealed class TDynamicArray<_T> : IDisposable
        {
            internal TDynamicArray()
            {
                m_Items = ArrayList.Synchronized(new ArrayList());
                m_iCurrentReadPos = -1;
            }

            public void Dispose()
            {
                if (null != m_Items)
                {
                    m_Items.Clear();
                }

                m_iCurrentReadPos = -1;
                m_Items = null;
            }

            internal void Add(_T newEntry)
            {
                m_Items.Add(newEntry);
            }

            internal bool FMoveNext()
            {
                m_iCurrentReadPos++;
                return m_iCurrentReadPos < m_Items.Count;
            }

            internal _T GetCurrentEntry
            {
                get
                {
                    return (_T) m_Items[m_iCurrentReadPos];
                }
            }

            public _T Item(int iItem)
            {
                return (_T) m_Items[iItem];
            }
     
            internal Int32 StoredItems
            {
                get { return m_Items.Count;  }
            }

            internal int Count
            {
                get { return m_Items.Count;  }
            }
           
            //      Members
            ArrayList           m_Items;
            System.Int32        m_iCurrentReadPos;           //     Current reader position 
        }
    #endregion

    #region SortMatchTree
        internal class TSortMatchingTree<_T> : IDisposable
        {
            internal TSortMatchingTree(TDynamicList<_T>[] partitions, bool bAscending = true)
            {
                m_bAscending = bAscending;
                m_Partitions = partitions;

                m_bWinners = new Int16[MAX_LEVELS+1, MAX_PARTITIONS];
                         
                //      Lowest level is always self
                for(Int16 iPartition = 0; iPartition < MAX_PARTITIONS; iPartition++)
                {
                    m_bWinners[MAX_LEVELS, iPartition] = iPartition;
                }
            }

            public void Dispose()
            {
                m_bWinners = null;
            }
           
            public void PrimeEntries(IComparer<_T> comparer)
            {
                m_Comparer = comparer;

                for (Int16 iSlot = 0; iSlot < MAX_PARTITIONS; iSlot++)
                {
                    FTreeInsert(iSlot);
                }
            }

            /*=========================================================
                   0     1            Parents
                0  1  2  3           
               01 23 45 67 ...        Pairs
            =========================================================*/   
            //    Given an index, who are we playing this match against?
            //
            internal Int16 GetMatchTargetIndex(Int16 iPartition)
            {
                Int16 iMatchIndex = iPartition;
               
                iMatchIndex += (0 == iPartition % 2) ? (Int16)1 : (Int16)(-1);

                return iMatchIndex;
            }

            internal Int16 GetParentIndex(Int16 iPartition)
            {
                Int16 iParent;

                iParent = (Int16) (iPartition / 2);

                return iParent;
            }

            internal bool FTreeInsert(Int16 iPartitionIn)
            {
                bool    bRC = true;
                   
                //===========================================================================
                //            http://en.wikipedia.org/wiki/Tournament_sort
                //            http://en.wikipedia.org/wiki/Heapsort
                //            http://en.wikipedia.org/wiki/Merge_sort
                //===========================================================================       
                Int16 iPartition = iPartitionIn;
               
                for (Int16 iCurrentLevel = MAX_LEVELS; iCurrentLevel > 0; iCurrentLevel--)
                {
                    Int16 iPartnerPartition = GetMatchTargetIndex(iPartition);
                    Int16 iParent = GetParentIndex(iPartition);

                    Int16 iPartnerCompare = m_bWinners[iCurrentLevel, iPartnerPartition];
                    Int16 iCurrentCompare = m_bWinners[iCurrentLevel, iPartition];

                    Int16 iPrevWinner = m_bWinners[iCurrentLevel - 1, iParent];

                    if (INVALID_PARTITION == iPartnerCompare || false == m_Partitions[iPartnerCompare].FHasData)
                    {
                        m_bWinners[iCurrentLevel - 1, iParent] = iCurrentCompare;
                    }
                    else if (INVALID_PARTITION == iCurrentCompare || false == m_Partitions[iCurrentCompare].FHasData)
                    {
                        m_bWinners[iCurrentLevel - 1, iParent] = iPartnerCompare;
                    }
                    else
                    {
                        if (true == m_bAscending)
                        {
                            if(m_Comparer.Compare(m_Partitions[iCurrentCompare].GetCurrentEntry,  m_Partitions[iPartnerCompare].GetCurrentEntry) < 0)
                            {
                                m_bWinners[iCurrentLevel - 1, iParent] = iCurrentCompare;
                            }
                            else
                            {
                                m_bWinners[iCurrentLevel - 1, iParent] = iPartnerCompare;
                            }
                        }
                        else
                        {
                            if (m_Comparer.Compare(m_Partitions[iCurrentCompare].GetCurrentEntry, m_Partitions[iPartnerCompare].GetCurrentEntry) > 0)
                            {
                                m_bWinners[iCurrentLevel - 1, iParent] = iCurrentCompare;
                            }
                            else
                            {
                                m_bWinners[iCurrentLevel - 1, iParent] = iPartnerCompare;
                            }
                        }
                    }

                    if (iPrevWinner == m_bWinners[iCurrentLevel - 1, iParent] && iPrevWinner != iPartitionIn)        //     Winner unchanged (and not the re-inserted partition), so matches above remain valid
                    {
                        break;
                    }

                    iPartition = iParent;
                }
               
                return bRC;
            }

            internal bool RemoveHead(out _T entry)
            {
                bool bRC = false;
                Int16 iWinner = m_bWinners[0, 0];

                if (INVALID_PARTITION == iWinner || false == m_Partitions[iWinner].FHasData)
                {
                    ;
                }
                else
                {
                    entry = m_Partitions[iWinner].GetCurrentEntry;

                    m_Partitions[iWinner].FMoveNext();        //  FTreeInsert has to be called in true or false case
                    FTreeInsert(iWinner);

                    bRC = true;
                    return bRC;
                }

                entry = default(_T);
                return bRC;
            }

            internal void ValidateAllProcessed()
            {
                for (int iPartition = 0; iPartition < MAX_PARTITIONS; iPartition++)
                {
                    System.Diagnostics.Debug.Assert( false == m_Partitions[iPartition].FHasData);
                }
            }

            //      Members and Constants
            public const int MAX_PARTITIONS     = 32;           //     Has to be an even number to allow matching to work properly - must be a power of 2 to handle max levels
            public const int INVALID_PARTITION  = 255;          //     No further entries in the partition
            public const int MAX_LEVELS         = 5;
           
            TDynamicList<_T>[]          m_Partitions;           //  Reference to the partitions for actual values
            IComparer<_T>               m_Comparer;             //  Reference to comparison
            Int16[,]                    m_bWinners;             //  Tournament tree storage
            bool                        m_bAscending;           //  Sort direction
        }
       
        public abstract class ISortReaderWriter<_T>
        {
            abstract public bool FReadEntry(out _T entry, ref int iEnumerator, int iStepping);
            abstract public bool FWriteEntry(_T entry);

            abstract public void ClearReadAhead();
            abstract public void DoReadAhead(int iSourceFileSlot);
        }

        public class TSortManager<_T> : ISortReaderWriter<_T>, IDisposable
        {
            const Int32 OUTPUT_CHIMB = 250000;

            public override void ClearReadAhead()
            {
                throw new NotImplementedException();
            }
            public override void DoReadAhead(int iSourceFileSlot)
            {
                throw new NotImplementedException();
            }

            internal TSortManager(ISortLogOutput oSortOutput)
            {
                m_oSortOutput = oSortOutput;

                m_PartitionedItemStorage = new TDynamicList<_T>[TSortMatchingTree<_T>.MAX_PARTITIONS];

                for (int iSlot = 0; iSlot < TSortMatchingTree<_T>.MAX_PARTITIONS; iSlot++)
                {
                    m_PartitionedItemStorage[iSlot] = new TDynamicList<_T>();
                }

                m_MatchingTree = new TSortMatchingTree<_T>(m_PartitionedItemStorage);
               
                m_i32ActivePartition = 0;
                m_ui64TotalReadEntries = 0;
                m_ui64TotalWriteEntries = 0;

                m_stopWatch = new System.Diagnostics.Stopwatch();
            }

            public void Dispose()
            {
                Dispose(true);
                GC.SuppressFinalize(this);
            }

            protected virtual void Dispose(bool bDisposing)
            {
                if (true == bDisposing)
                {
                    m_MatchingTree.Dispose();
                    m_MatchingTree = null;

                    if (null != m_PartitionedItemStorage)
                    {
                        for (Int32 iSlot = 0; iSlot < TSortMatchingTree<_T>.MAX_PARTITIONS; iSlot++)
                        {
                            m_PartitionedItemStorage[iSlot] = null;
                        }
                    }

                    m_PartitionedItemStorage = null;
                    m_stopWatch = null;
                }
            }

            public void LogOutputMessage(string strMsg)
            {
                m_oSortOutput.LogOutputMessage(strMsg);
            }

            //      Number of parallel reader threads: leave one core free, clamped to the range [2, 4]
            internal int ReadCPS
            {
                get
                {
                    return Math.Min(Math.Max(2, Environment.ProcessorCount - 1), 4);
                }
            }
           
            static int m_iReaderThreadId = -1;      //  Interlocked counter handing each reader thread a unique starting file slot
            internal void ReaderThread()
            {
                int iEnumerator = System.Threading.Interlocked.Increment(ref m_iReaderThreadId);
                int iCPUs = ReadCPS;
              
                _T entry;

                while (true == FReadEntry(out entry, ref iEnumerator, iCPUs))
                {
                    Add(entry);

                    System.Int64 i64Read = System.Threading.Interlocked.Increment(ref m_ui64TotalReadEntries);
                    if (0 == i64Read % OUTPUT_CHIMB)
                    {
                        LogOutputMessage(String.Format("     Entries read: {0} : {1} per sec", i64Read, (UInt64)((double)i64Read / Math.Max(1, m_stopWatch.ElapsedMilliseconds / 1000.0))));
                    }
                }
            }

            internal void DoSort(IComparer<_T> comparer)
            {
                _T entry;

                m_stopWatch.Start();

                LogOutputMessage("Loading partitions");

                int iCPUs = ReadCPS;
                System.Threading.Thread[] readThreads = new System.Threading.Thread[iCPUs];

                for (int iThread = 0; iThread < iCPUs; iThread++)
                {
                    readThreads[iThread] = new System.Threading.Thread(ReaderThread);
                    readThreads[iThread].Start();
                }

                foreach (System.Threading.Thread thread in readThreads)
                {
                    thread.Join();
                }

                LogOutputMessage("Reads complete, invoking parallel sort of partitions");
                SetupSortForExternalConsumption(comparer);

                ClearReadAhead();
              
                LogOutputMessage("Gathering partitions and spooling output");

                Int64 ui64StartWriteMS = m_stopWatch.ElapsedMilliseconds;
               
                while(true == RemoveHead(out entry))
                {
                    m_ui64TotalWriteEntries++;
                    if (false == FWriteEntry(entry))
                    {
                        break;
                    }
                   
                    if (0 == m_ui64TotalWriteEntries % OUTPUT_CHIMB)
                    {
                        LogOutputMessage(String.Format("     Entries streamed: {0} : {1} per sec  {2}%", m_ui64TotalWriteEntries,
                           (UInt64)((double)m_ui64TotalWriteEntries / Math.Max(1, (m_stopWatch.ElapsedMilliseconds-ui64StartWriteMS) / 1000.0)),
                           (Int32) (((m_ui64TotalWriteEntries / (double) m_ui64TotalReadEntries) * 100.0))) );
                    }
                }

                m_stopWatch.Stop();

                ClearReadAhead();

                System.Diagnostics.Debug.Assert(m_ui64TotalReadEntries == m_ui64TotalWriteEntries);
                m_MatchingTree.ValidateAllProcessed();
            }

            override public bool FReadEntry(out _T entry, ref int iEnumerator, int iStepping)
            {
                throw new NotImplementedException();
            }

            override public bool FWriteEntry(_T entry)
            {
                throw new NotImplementedException();
            }

            internal void Add(_T newEntry)
            {
                Int32 iPartition = System.Threading.Interlocked.Increment(ref m_i32ActivePartition) % TSortMatchingTree<_T>.MAX_PARTITIONS;
                m_PartitionedItemStorage[iPartition].Add(newEntry);
            }

            internal void SetupSortForExternalConsumption(IComparer<_T> comparer)
            {
                Parallel.ForEach(m_PartitionedItemStorage, (partitionList) =>
                                    {
                                        partitionList.Sort(comparer);
                                        partitionList.FMoveNext();       //  Position to first entry in list or EOF
                                    }
                    );
               
                 m_MatchingTree.PrimeEntries(comparer);
            }

            internal bool RemoveHead(out _T entry)
            {
                return m_MatchingTree.RemoveHead(out entry);
            }

            internal Int64 Write_Stats
            {
                get { return m_ui64TotalWriteEntries;  }
            }

            internal Int64 Read_Stats
            {
                get { return m_ui64TotalReadEntries; }
            }

            internal System.Diagnostics.Stopwatch StopWatch
            {
                get { return m_stopWatch;  }
            }

            //      Members 
            TDynamicList<_T>[]                  m_PartitionedItemStorage;
            Int32                               m_i32ActivePartition;
            TSortMatchingTree<_T>               m_MatchingTree;
            long                                m_ui64TotalReadEntries;
            long                                m_ui64TotalWriteEntries;
            System.Diagnostics.Stopwatch        m_stopWatch;
            SortManager.ISortLogOutput          m_oSortOutput;
        }
    #endregion
       
    }           //      End of generic sort manager namespace


    namespace MergeXEvents
    {
    #region SortEntry
        public class CXESortEntry
        {
            internal CXESortEntry()
            {
                m_timestamp = default(DateTimeOffset);
                m_ui64EventSequence = 0;       //  0 indicates NOT set
                m_ui64ServerNameHash = 0;
                m_iCookie = 0;
                m_EventLocator = null;          //  Only unique at the file level, so the cookie that selects the proper file is required to get back to the correct offset

                m_uiUniqueIdentity = (UInt64) System.Threading.Interlocked.Increment(ref s_uiUniqueIdentity);
            }

            internal CXESortEntry(DateTimeOffset timestamp, UInt64 ui64EventSeq, UInt64 uiServerNameHash)
            {
                m_timestamp = timestamp;
                m_ui64EventSequence = ui64EventSeq;
                m_ui64ServerNameHash = uiServerNameHash;
                m_iCookie = 0;
                m_EventLocator = null;

                //      Wanted to use memory address of the entry but with GC moving things we needed a stable value
                //
                //      This is added because the data could have duplicate entries.  For example if you merge files from
                //      2 different servers the event sequence could be repeated (1 from each trace)
                //
                m_uiUniqueIdentity = (UInt64) System.Threading.Interlocked.Increment(ref s_uiUniqueIdentity);
            }

            internal UInt64 EventSequence
            {
                get { return m_ui64EventSequence; }
                set { m_ui64EventSequence = value; }
            }

            internal UInt64 ServerNameHash
            {
                get { return m_ui64ServerNameHash; }
                set { m_ui64ServerNameHash = value; }
            }

            internal DateTimeOffset Timestamp
            {
                get { return m_timestamp; }
                set { m_timestamp = value; }
            }

            internal UInt64 UniqueIdentity
            {
                get { return m_uiUniqueIdentity;  }
            }

            public EventLocator EventLocator
            {
                get { return m_EventLocator; }
                set { m_EventLocator = value; }
            }

            public Int32 Cookie
            {
                get { return m_iCookie;  }
                set { m_iCookie = value;  }
            }

            //      Members
            EventLocator    m_EventLocator;         //  So we can use this to get the event later
            Int32           m_iCookie;              //  External cookie for sort reader and writer logic
            DateTimeOffset  m_timestamp;            //  timestamp on the event
            UInt64          m_ui64EventSequence;    //  event_sequence action
            UInt64          m_ui64ServerNameHash;   //  GetHashCode of the server_principal_name action
            UInt64          m_uiUniqueIdentity;     //  Used to make sure we have a final sort order if timestamps and server names match without event sequence

            static long s_uiUniqueIdentity;
        }
    #endregion

    #region Comparison
        class CXEComparerSeqTimeServer : IComparer<CXESortEntry>
        {
            int IComparer<CXESortEntry>.Compare(CXESortEntry xeLeft, CXESortEntry xeRight)
            {
                Int32 iRC = 0;      //  Note: the keys are UInt64s so we cannot simply subtract; the difference can overflow the Int32 return value

                bool bDoTimestamp = false;

                //      Event sequence
                if (0 != xeLeft.EventSequence && 0 != xeRight.EventSequence)
                {
                    if (xeLeft.EventSequence < xeRight.EventSequence)
                        iRC = -1;
                    else if (xeLeft.EventSequence > xeRight.EventSequence)
                        iRC = 1;
                    else
                        bDoTimestamp = true;
                }
                else
                {
                    bDoTimestamp = true;
                }
               
                if (true == bDoTimestamp)
                {
                    if (xeLeft.Timestamp < xeRight.Timestamp)
                        iRC = -1;
                    else if (xeLeft.Timestamp > xeRight.Timestamp)
                        iRC = 1;
                }

                if (0 == iRC)
                {
                    if (0 != xeLeft.ServerNameHash && 0 != xeRight.ServerNameHash)
                    {
                        if (xeLeft.ServerNameHash < xeRight.ServerNameHash)
                            iRC = -1;
                        else if (xeLeft.ServerNameHash > xeRight.ServerNameHash)
                            iRC = 1;
                    }
                }

                if (0 == iRC)
                {
                    if (xeLeft.UniqueIdentity < xeRight.UniqueIdentity)
                        iRC = -1;
                    else if (xeLeft.UniqueIdentity > xeRight.UniqueIdentity)
                        iRC = 1;
                }
               
                return iRC;
            }

        }
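
        //  Ordering illustration (hypothetical values): when both event sequences are
        //  set, event_sequence wins even when the timestamps disagree:
        //
        //      left : EventSequence = 5, Timestamp = 10:00:02
        //      right: EventSequence = 9, Timestamp = 10:00:01
        //      Compare(left, right) -> -1      (left streams first)
        //
        //  When either sequence is 0 (unset) the comparison falls through to the
        //  timestamp, then the server name hash, then the process-wide
        //  UniqueIdentity, so two distinct entries never compare as equal.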
    #endregion

    #region StreamToFile
        internal class CSourceFileEntry :  IDisposable
        {
            public CSourceFileEntry(String strFileName)
            {
                m_strXELName = strFileName;
                m_XEventSourceFile = null;
                m_XEEventEnumerator = null;
                #if SEPERATE_OUTPUT
                    m_XEventSourceFileDuplicateForOutput = null;
                    m_DuplicateOpenSemaphore = new Semaphore(0, 1);
                #endif
            }

            public void Dispose()
            {
                m_strXELName = String.Empty;

                if(null != m_XEventSourceFile)
                {
                    m_XEventSourceFile.Dispose();
                    m_XEventSourceFile = null;
                }

                #if SEPERATE_OUTPUT
                    if(null != m_XEventSourceFileDuplicateForOutput)
                    {
                        m_XEventSourceFileDuplicateForOutput.Dispose();
                        m_XEventSourceFileDuplicateForOutput = null;
                    }

                    if (null != m_DuplicateOpenSemaphore)
                    {
                        m_DuplicateOpenSemaphore.Close();
                        m_DuplicateOpenSemaphore = null;
                    }

                #endif
            }

            public String FileName
            {
                get { return m_strXELName;  }
            }

            public QueryableXEventData XELFile
            {
                get { return m_XEventSourceFile;  }
            }

            public IEnumerator<PublishedEvent> EventEnumerator
            {
                get { return m_XEEventEnumerator;  }
            }

            #if SEPERATE_OUTPUT
                public QueryableXEventData XELFileForOutput
                {
                    get { return m_XEventSourceFileDuplicateForOutput; }
                }

                public IEnumerator<PublishedEvent> EventEnumeratorForOutput
                {
                    get { return m_XEEventEnumeratorDuplicateForOutput; }
                }

                public void WaitForOutputOpen()
                {
                    if (true == m_DuplicateOpenSemaphore.WaitOne())
                    {
                        m_DuplicateOpenSemaphore.Release();
                    }
                }
            #endif

            public void Open()
            {
                #if SEPERATE_OUTPUT
                    //  Found that if we read the file into a second provider we can pass the published
                    //  event across the providers (identical metadata and buffers), separating
                    //  retrieval from serialization and increasing the output speed
                    //
                    try
                    {
                        m_XEventSourceFileDuplicateForOutput = new QueryableXEventData(m_strXELName);
                        m_XEEventEnumeratorDuplicateForOutput = m_XEventSourceFileDuplicateForOutput.GetEnumerator();
                    }
                    finally
                    {
                        m_DuplicateOpenSemaphore.Release();
                    }
                #endif

                m_XEventSourceFile = new QueryableXEventData(m_strXELName);
                m_XEEventEnumerator = m_XEventSourceFile.GetEnumerator();
            }

            public bool FIsOpen()
            {
                return null != m_XEEventEnumerator;
            }

            private QueryableXEventData         m_XEventSourceFile;
            private IEnumerator<PublishedEvent> m_XEEventEnumerator;
           
            #if SEPERATE_OUTPUT
                    private QueryableXEventData         m_XEventSourceFileDuplicateForOutput;
                    private IEnumerator<PublishedEvent> m_XEEventEnumeratorDuplicateForOutput;
                    private Semaphore                   m_DuplicateOpenSemaphore;
            #endif
           
            private String                      m_strXELName;
        }

        public class CXEventSortStream : SortManager.TSortManager<CXESortEntry>
        {
            internal CXEventSortStream(SortManager.ISortLogOutput oLogOutput) :
                base(oLogOutput)
            {
                m_XEventSourceFiles = new SortManager.TDynamicArray<CSourceFileEntry>();
            }

            internal void DoSort()
            {
                DoSort(new CXEComparerSeqTimeServer());
            }

            override public bool FReadEntry(out CXESortEntry entry, ref int iEnumerator, int iStepping)
            {
                bool bRC = false;

                PublishedEvent publishedEvent = null;

                do
                {
                    if (iEnumerator < m_XEventSourceFiles.StoredItems)
                    {
                        if(false == m_XEventSourceFiles.Item(iEnumerator).FIsOpen())
                        {
                            m_XEventSourceFiles.Item(iEnumerator).Open();
                        }

                        if (true == m_XEventSourceFiles.Item(iEnumerator).EventEnumerator.MoveNext())
                        {
                            publishedEvent = m_XEventSourceFiles.Item(iEnumerator).EventEnumerator.Current;
                        }
                        else
                        {
                            iEnumerator += iStepping;
                        }
                    }

                } while (null == publishedEvent && iEnumerator < m_XEventSourceFiles.StoredItems);

                if (null != publishedEvent)
                {
                    entry = new CXESortEntry();

                    entry.EventLocator = publishedEvent.Location;
                    entry.Cookie = iEnumerator;
                    entry.Timestamp = publishedEvent.Timestamp;

                    PublishedAction eventSeq;
                    if (true == publishedEvent.Actions.TryGetValue(@"event_sequence", out eventSeq))
                    {
                        entry.EventSequence = (UInt64) eventSeq.Value;
                    }

                    PublishedAction serverName;
                    if (true == publishedEvent.Actions.TryGetValue(@"server_principal_name", out serverName))
                    {
                        entry.ServerNameHash = (UInt64)serverName.Value.GetHashCode();
                    }

                    bRC = true;
                }
                else
                {
                    entry = default(CXESortEntry);
                }

              
                return bRC;
            }

            override public bool FWriteEntry(CXESortEntry entry)
            {
                return true;
            }

            internal void AddInputFile(string strXELFile)
            {
                CSourceFileEntry se = new CSourceFileEntry(strXELFile);
                lock (m_XEventSourceFiles)
                {
                    m_XEventSourceFiles.Add(se);
                }
            }

            internal SortManager.TDynamicArray<CSourceFileEntry> SourceFiles
            {
                get { return m_XEventSourceFiles; }
            }

            //      Members
            private SortManager.TDynamicArray<CSourceFileEntry> m_XEventSourceFiles;
        }

        class CParallelPayload
        {
            public CParallelPayload(CXEventFileToFileSortManager sortFileMgr, CXESortEntry sortEntry)
            {
                m_sortFileMgr = sortFileMgr;
                m_sortEntry = sortEntry;
                m_publishedEvent = null;
            }

            public CXESortEntry SortEntry
            {
                get { return m_sortEntry; }
            }

            public CXEventFileToFileSortManager SortFileMgr
            {
                get { return m_sortFileMgr; }
            }

            public PublishedEvent PublishedEvent
            {
                get { return m_publishedEvent; }
                set { m_publishedEvent = value;  }
            }

            private CXEventFileToFileSortManager m_sortFileMgr;
            private CXESortEntry m_sortEntry;
            private PublishedEvent m_publishedEvent;
        }

        internal class CScatterGatherStream : ScatterGatherParallelMgr<CParallelPayload>
        {
            public CScatterGatherStream() :
                base(2)
            {
            }

            override public void DoUnOrderedWork(CParallelPayload pPayload)
            {
                pPayload.SortFileMgr.DoUnorderedWork(pPayload);
            }
            override public void DoOrderedWork(CParallelPayload pPayload)
            {
                pPayload.SortFileMgr.DoOrderedWork(pPayload);
            }
           
            override public void OrderedWorkerStalled(CParallelPayload pPayload)
            {
            }

            override public void OrderedWorkerNoStall()
            {
            }

        }


        internal class CXEventFileToFileSortManager : CXEventSortStream
        {
            internal CXEventFileToFileSortManager(SortManager.ISortLogOutput oLogOutput)  :
                base(oLogOutput)
            {
                m_iCancelReadAhead = 0;
                m_ScatterGatherOutputStream = new CScatterGatherStream();
                m_strLastOutputName = String.Empty;
                m_lFileCount = 1;
                m_strOutputFile = String.Empty;
                m_strOutputDir = String.Empty;
                m_XEOutputStream = null;
                m_ui64RolloverCount = 0;

                m_dictReadAhead = new Dictionary<int, Thread>();
            }

            ~CXEventFileToFileSortManager()
            {
                CloseFile();
            }

            new internal void DoSort()
            {
                base.DoSort();
                m_ScatterGatherOutputStream.CompleteAllWork();
                CloseFile();
            }
          
            private void CloseFile()
            {
                if(String.Empty != m_strLastOutputName && null != m_XEOutputStream)
                {
                    m_XEOutputStream.Dispose();
                    m_XEOutputStream = null;
                  
                    string strDirectory = System.IO.Path.GetDirectoryName(m_strLastOutputName);
                    strDirectory = System.IO.Path.GetFullPath(strDirectory);

                    string strFileSpec = System.IO.Path.GetFileName(m_strLastOutputName);

                    //  Strip the ".xel" extension
                    strFileSpec = strFileSpec.Substring(0, strFileSpec.Length - 4);
                    strFileSpec += "*.xel";

                    IEnumerable<string> strFiles = System.IO.Directory.EnumerateFiles(strDirectory, strFileSpec);
                   
                    int iLoops = 0;
                    foreach(string strFile in strFiles)
                    {
                        if (0 != iLoops++)
                        {
                            SetNewOutputFileName();
                        }

                        FileAttributes fa = File.GetAttributes(strFile);
                        if (FileAttributes.Compressed == (FileAttributes.Compressed & fa))
                        {
                            LogOutputMessage(" WARN: Performance may be impacted as output file appears to be using compression.");
                        }

                        System.IO.Directory.Move(strFile, m_strLastOutputName);
                    }
                       
                    m_strLastOutputName = String.Empty;
                }
            }

            internal void SetNewOutputFileName(CXESortEntry entry = null)
            {

                if (null != entry)
                {
                    DateTimeOffset ts = entry.Timestamp.ToUniversalTime();

                    //  Path.Combine supplies the directory separator that GetDirectoryName strips
                    m_strLastOutputName = System.IO.Path.Combine(m_strOutputDir,
                                String.Format(@"{0}_0 ymd({1}-{2}-{3}) hms({4}-{5}-{6}.{7})_{8}.xel", m_strOutputFile,
                                ts.Year, ts.Month, ts.Day,
                                ts.Hour, ts.Minute, ts.Second, ts.Millisecond,
                                m_lFileCount++));
                }
                else   //   Rollover path
                {
                    string strCount = (m_lFileCount -1).ToString();     //  File count is incremented when the name was built
                    string strFile = m_strLastOutputName.Substring(0, m_strLastOutputName.Length - 4 - strCount.Length);

                    m_strLastOutputName = String.Format(@"{0}{1}.xel", strFile, m_lFileCount++);
                }

                if (true == File.Exists(m_strLastOutputName))
                {
                    FileAttributes fa = File.GetAttributes(m_strLastOutputName);
                    File.SetAttributes(m_strLastOutputName, fa & ~FileAttributes.ReadOnly);
                    System.IO.File.Delete(m_strLastOutputName);
                }
            }

            internal void DoUnorderedWork(CParallelPayload payload)
            {
                lock (SourceFiles.Item(payload.SortEntry.Cookie).XELFile.EventProvider)
                {
                    payload.PublishedEvent = SourceFiles.Item(payload.SortEntry.Cookie).XELFile.EventProvider.RetrieveEvent(payload.SortEntry.EventLocator);
                }
            }


    #if DEBUG
            UInt64 m_ui64SeqCheck = 0;
    #endif
            internal void DoOrderedWork(CParallelPayload payload)
            {
                //========================================================================
                //  Generate file name for output based on the timestamp so it is easy to see which file you want to open.
                //  Toggle close and open the output stream to force a numbered rollover
                //
                //      We use 1 million because this is the current SSMS display limit
                //
                //      1,000,000 approaches 1GB too often so shrink the number of entries per output file
                //========================================================================
                if (null == m_XEOutputStream || 0 == (m_ui64RolloverCount + 1) % 500000)
                {
                    CloseFile();
                    SetNewOutputFileName(payload.SortEntry);

                    //      Setup the file for output
                    m_XEOutputStream = new XEventFileSerializer(m_strLastOutputName);
                }

    #if DEBUG
                //  Safety check
                if (m_ui64SeqCheck < payload.SortEntry.EventSequence)
                {
                    m_ui64SeqCheck = payload.SortEntry.EventSequence;
                }
                else
                {
                    throw new Exception("UTILITY ERROR; Attempt to stream duplicate entry");
                }
    #endif
                #if SEPERATE_OUTPUT
                    SourceFiles.Item(payload.SortEntry.Cookie).XELFileForOutput.EventProvider.SerializeEvent(m_XEOutputStream, payload.PublishedEvent);
                    m_ui64RolloverCount++;
                #else
                    lock (SourceFiles.Item(payload.SortEntry.Cookie).XELFile.EventProvider)
                    {
                        SourceFiles.Item(payload.SortEntry.Cookie).XELFile.EventProvider.SerializeEvent(m_XEOutputStream, payload.PublishedEvent);
                        m_ui64RolloverCount++;
                    }
    #endif
            }

            override public bool FWriteEntry(CXESortEntry entry)
            {
                CParallelPayload payload = new CParallelPayload(this, entry);
                 
                //      Do inline work
                 #if SEPERATE_OUTPUT
                    m_ScatterGatherOutputStream.FQueueWorkItem(payload);
                #else
                    m_ScatterGatherOutputStream.DoUnOrderedWork(payload);
                    m_ScatterGatherOutputStream.DoOrderedWork(payload);
                #endif
          
                return true;
            }

            public override void ClearReadAhead()
            {
                //  Let everyone know they are done
                Interlocked.Increment(ref m_iCancelReadAhead);

                lock(m_dictReadAhead)
                {
                    //  Make sure all active threads are done
                    foreach(KeyValuePair<int, Thread> oKV in m_dictReadAhead)
                    {
                        oKV.Value.Join();
                    }

                    m_dictReadAhead.Clear();
                }

                //  Reset for next execution
                m_iCancelReadAhead = 0;
            }
            public override void DoReadAhead(int iFileEnumSlot)
            {
                int iCookie = -1;
                Thread tReadAhead = new Thread(DoReadAheadInternal);

                lock (m_dictReadAhead)
                {
                    Thread tReadAheadDummy;
                    if (false == m_dictReadAhead.TryGetValue(iFileEnumSlot, out tReadAheadDummy))
                    {
                        iCookie = iFileEnumSlot;
                        m_dictReadAhead.Add(iFileEnumSlot, tReadAhead);
                    }
                }

                //  Start the read-ahead only when this slot has not already been warmed into the file system cache
                if (iCookie > -1)
                {
                    tReadAhead.Start(iCookie);
                }
            }

            internal void DoReadAheadInternal(object oParms)
            {
                try
                {
                    int iCookie = (int) oParms;
                    string strName = SourceFiles.Item(iCookie).FileName;
    #if DEBUG
                    LogOutputMessage(String.Format(" INFO: Starting file system cache read-ahead for {0}",  strName));
    #endif
                    //      We simply open the file and read it in chunks so the sequential scan fills the file system cache quickly
                    const int c_iMaxReadSize = ((1024 * 1024) * 4);
                   
                    using (FileStream s = new FileStream(strName, FileMode.Open, FileAccess.Read, FileShare.Read | FileShare.Write | FileShare.Delete, c_iMaxReadSize, FileOptions.SequentialScan))
                    {
                        byte[] bData = new byte[c_iMaxReadSize];

                        ulong ulLoops = 0;
                        int iBytesRead = s.Read(bData, 0, c_iMaxReadSize);
                        while (0 == m_iCancelReadAhead && iBytesRead > 0)
                        {
                            iBytesRead = 0;
                            ulLoops++;
                            if(0 == ulLoops % 100)
                            {
                                Thread.Sleep(10);
                            }

                            iBytesRead = s.Read(bData, 0, c_iMaxReadSize);
                        }
                    }

                    #if SEPERATE_OUTPUT
                        SourceFiles.Item(iCookie).WaitForOutputOpen();
                        if (null != SourceFiles.Item(iCookie).EventEnumeratorForOutput)
                        {
                            while (true == SourceFiles.Item(iCookie).EventEnumeratorForOutput.MoveNext()) ;
                        }
                    #endif
                }
                catch(Exception e)
                {
                    LogOutputMessage(String.Format(" INFO: read-ahead encountered an exception; ignoring. {0}", e.ToString()));
                }
            }

            internal void SetOutputFile(string strXELFile)
            {
                m_strOutputFile = System.IO.Path.GetFileNameWithoutExtension(strXELFile);
                m_strOutputDir = System.IO.Path.GetDirectoryName(strXELFile);
            }
         
            //      Members
            Dictionary<int, Thread>                                 m_dictReadAhead;
            CScatterGatherStream                                    m_ScatterGatherOutputStream;
            string                                                  m_strOutputFile;
            string                                                  m_strOutputDir;
            XEventFileSerializer                                    m_XEOutputStream;
            string                                                  m_strLastOutputName;
            int                                                     m_lFileCount;
            int                                                     m_iCancelReadAhead;
            UInt64                                                  m_ui64RolloverCount;
        }
    #endregion

        class CSortOutput : SortManager.ISortLogOutput, IDisposable
        {
            public CSortOutput(string strOutputPath)
            {
                m_OutputFile = null;

                string strDirectory = System.IO.Path.GetDirectoryName(strOutputPath);

                Program.FCreateDirectoryWithACLs(strDirectory);

                m_OutputFile = new System.IO.StreamWriter(String.Format(@"{0}\MergeEvents.log", strDirectory));
            }

            public void Dispose()
            {
                Dispose(true);
                GC.SuppressFinalize(this);
            }

            protected void Dispose(bool bDisposing)
            {
                if(true == bDisposing)
                {
                    m_OutputFile.Close();
                    m_OutputFile = null;
                }
            }

            override public void LogOutputMessage(string strMsg)
            {
                Program.LogToConsole(strMsg);
                if (null != m_OutputFile)
                {
                    m_OutputFile.WriteLine("{0} {1} {2}", DateTime.Now, System.Threading.Thread.CurrentThread.ManagedThreadId, strMsg);
                    m_OutputFile.Flush();
                }
            }

            private System.IO.StreamWriter m_OutputFile;
        }

        class Program
        {
            static internal void LogToConsole(string strMsg)
            {
                Console.WriteLine("{0} {1} {2}", DateTime.Now, System.Threading.Thread.CurrentThread.ManagedThreadId, strMsg);
            }

            static internal void ShowUsage()
            {
                LogToConsole("");
                LogToConsole("Usage");
                LogToConsole("------------------------------------------------------------------------");
                LogToConsole("/? Present this help information");
                LogToConsole("");
                LogToConsole("EXAMPLE:  MergeXEvents.exe <input spec> <output spec>");
                LogToConsole("          MergeXEvents.exe \"C:\\temp\\XEvent_0_130494871596480000.xel\" \"C:\\temp\\Sorted.xel\"");
                LogToConsole("");
                LogToConsole("<input spec> any valid directory and wildcard.   All files matching wildcard in the directory will be processed as input");
                LogToConsole("<output spec> is any valid file name where output for the merge can be stored, including rollover files.");
                LogToConsole("");
            }

    #region AssemblyInformation
            static public string AssemblyTitle
            {
                get
                {
                    object[] attributes = Assembly.GetExecutingAssembly().GetCustomAttributes(typeof(AssemblyTitleAttribute), false);
                    if (attributes.Length > 0)
                    {
                        AssemblyTitleAttribute titleAttribute = (AssemblyTitleAttribute)attributes[0];
                        if (false == String.IsNullOrEmpty(titleAttribute.Title))
                            return titleAttribute.Title;
                    }

                    return System.IO.Path.GetFileNameWithoutExtension(Assembly.GetExecutingAssembly().CodeBase);
                }
            }

            static public string AssemblyVersion
            {
                get
                {
                    return VersionStrings.VERSION_MAJOR_STR + VersionStrings.VERSION_MINOR_STR + VersionStrings.VERSION_BUILD_STR;
                }
            }

           static  public string AssemblyDescription
            {
                get
                {
                    object[] attributes = Assembly.GetExecutingAssembly().GetCustomAttributes(typeof(AssemblyDescriptionAttribute), false);
                    if (attributes.Length == 0)
                        return "";

                    return ((AssemblyDescriptionAttribute)attributes[0]).Description;
                }
            }

            static public string AssemblyProduct
            {
                get
                {
                    object[] attributes = Assembly.GetExecutingAssembly().GetCustomAttributes(typeof(AssemblyProductAttribute), false);
                    if (attributes.Length == 0)
                        return "";
                   
                    return ((AssemblyProductAttribute)attributes[0]).Product;
                }
            }

            static public string AssemblyCopyright
            {
                get
                {
                    object[] attributes = Assembly.GetExecutingAssembly().GetCustomAttributes(typeof(AssemblyCopyrightAttribute), false);
                    if (attributes.Length == 0)
                        return "";

                    return ((AssemblyCopyrightAttribute)attributes[0]).Copyright;
                }
            }

            static public string AssemblyCompany
            {
                get
                {
                    object[] attributes = Assembly.GetExecutingAssembly().GetCustomAttributes(typeof(AssemblyCompanyAttribute), false);
                    if (attributes.Length == 0)
                        return "";

                    return ((AssemblyCompanyAttribute)attributes[0]).Company;
                }
            }
            #endregion

    #region DIR_ACLS
            const int FILE_PERSISTENT_ACLS = 0x00000008;

            [DllImport("kernel32.dll", CharSet = CharSet.Unicode)]
            public static extern bool GetVolumeInformation(string lpRootPathName,
                                                                StringBuilder lpVolumeNameBuffer,
                                                                [MarshalAs(UnmanagedType.U4)] int nVolumeNameSize,
                                                                [MarshalAs(UnmanagedType.U4)] ref int lpVolumeSerialNumber,
                                                                [MarshalAs(UnmanagedType.U4)] ref int lpMaxComponentLen,
                                                                [MarshalAs(UnmanagedType.U4)] ref int lpFileSystemFlags,
                                                                StringBuilder lpFileSystemNameBuffer,
                                                                [MarshalAs(UnmanagedType.U4)] int nFileSystemNameSize);

            static public bool FCreateDirectoryWithACLs(string strDirName)
            {
                bool bRC = false;

                System.IO.Directory.CreateDirectory(strDirName);
                DirectorySecurity dss = new DirectorySecurity();

                try
                {
                    dss.AddAccessRule(new FileSystemAccessRule(@"NT AUTHORITY\ANONYMOUS LOGON",
                                                                FileSystemRights.FullControl,
                                                                AccessControlType.Deny));
                }
                catch (Exception e)
                {
                    LogToConsole(String.Format("Exception encountered setting ANONYMOUS access rule {0}", e.ToString()));
                    throw;
                }
           
                try
                {
                    dss.AddAccessRule(new FileSystemAccessRule(System.Security.Principal.WindowsBuiltInRole.Guest.ToString(),
                                                                FileSystemRights.FullControl,
                                                                AccessControlType.Deny));
                }
                catch (Exception e)
                {
                    LogToConsole(String.Format("Exception encountered setting GUEST access rule {0}", e.ToString()));
                    throw;
                }

                try
                {
                    dss.AddAccessRule(new FileSystemAccessRule(@"BuiltIn\Administrators",
                                                                FileSystemRights.ReadExtendedAttributes | FileSystemRights.ReadAttributes | FileSystemRights.Synchronize,
                                                                AccessControlType.Allow));
                }
                catch (Exception e)
                {
                    LogToConsole(String.Format("Exception encountered setting Administrators access rule {0}", e.ToString()));
                    throw;
                }

                String strCurrentUser = System.Security.Principal.WindowsIdentity.GetCurrent().Name;

                try
                {
                    dss.AddAccessRule(new FileSystemAccessRule(strCurrentUser,
                                                               FileSystemRights.FullControl,
                                                               AccessControlType.Allow));
                }
                catch (Exception e)
                {
                    LogToConsole(String.Format("Exception encountered setting current user access rule for {0}: {1}", strCurrentUser, e));
                    throw;
                }

                String strRoot = System.IO.Path.GetPathRoot(strDirName);
                if (false == strRoot.EndsWith(@"\", StringComparison.Ordinal))
                {
                    strRoot += @"\";
                }

                int iSerialNumber = 0;
                int iMaxComponent = 0;
                int iFlags = 0;

                bRC = GetVolumeInformation(strRoot, null, 0, ref iSerialNumber, ref iMaxComponent, ref iFlags, null, 0);
                if (false == bRC)
                {
                    LogToConsole("Attempt to obtain volume information failed.");
                }
                else if (FILE_PERSISTENT_ACLS != (iFlags & FILE_PERSISTENT_ACLS))
                {
                    LogToConsole("ERROR: Specified location does not support proper access controls.");
                    bRC = false;
                }

                if(true == bRC)
                {
                    System.IO.Directory.SetAccessControl(strDirName, dss);
                    bRC = true;
                }

                return bRC;
            }
    #endregion

            //==============================================================
            static int Main(string[] args)
            {
                LogToConsole(AssemblyTitle);
                LogToConsole(String.Format("Version {0}", AssemblyVersion));
                LogToConsole(AssemblyCopyright);
                LogToConsole("------------------------------------------------------------");
                LogToConsole("");
               
                if(args.Length != 2)
                {
                    ShowUsage();
                    return -1;
                }

                return DoWork(args[0], args[1]);
            }

            private static int DoWork(string strInput, string strOutput)
            {
                int iRC = -100;

                try
                {
                    CXEventFileToFileSortManager xeSortMgr = new CXEventFileToFileSortManager(new CSortOutput(strOutput));
                    string strDirectory = System.IO.Path.GetDirectoryName(strInput);
                    strDirectory = System.IO.Path.GetFullPath(strDirectory);
                    string strFileSpec = System.IO.Path.GetFileName(strInput);
                    string[] strParts = strFileSpec.Split('_');

                    strFileSpec = String.Empty;
                   
                    for (int iLoop = 0; iLoop < strParts.Length - 2; iLoop++ )
                    {
                        strFileSpec += strParts[iLoop] + '_';
                    }

                    if(String.Empty != strFileSpec)
                    {
                        strFileSpec += "*.xel";
                    }

                    int iFiles = 0;
                    IEnumerable<string> strFiles = System.IO.Directory.EnumerateFiles(strDirectory, strFileSpec);
                    Parallel.ForEach (strFiles, (strFile) =>
                    {
                        LogToConsole(String.Format("VALIDATING INPUT SOURCE: {0}", strFile));
                    
                        xeSortMgr.AddInputFile(strFile);
                        System.Threading.Interlocked.Increment(ref iFiles);

                        //      Since the XE Linq reader uses file mapping, compression defeats the attempts to align on sector boundaries
                        //
                        FileAttributes fa = File.GetAttributes(strFile);
                        if (FileAttributes.Compressed == (FileAttributes.Compressed & fa))
                        {
                            LogToConsole(" WARN: Performance may be impacted as input file appears to be using compression.");
                        }
                    });

                    xeSortMgr.ClearReadAhead();

                    if(iFiles > 0)
                    {
                        strDirectory = System.IO.Path.GetDirectoryName(strOutput);
                        strDirectory = System.IO.Path.GetFullPath(strDirectory);

                        if (false == FCreateDirectoryWithACLs(strDirectory))
                        {
                            LogToConsole(String.Format("ERROR: Unable to create or set ACLs on {0}", strDirectory));
                            iRC = -6;
                        }
                        else
                        {
                            strFileSpec = System.IO.Path.GetFullPath(strOutput);

                            LogToConsole(String.Format("      OUTPUT: {0}", strFileSpec));
                            {
                                LogToConsole("Removing files in output location with matching file name pattern.");

                                string strFileNameNoExt = System.IO.Path.GetFileNameWithoutExtension(strFileSpec);
                                string strLastOutputNamePattern = String.Format(@"{0}_0 ymd(*) hms(*)_*.xel", strFileNameNoExt);
                                strFiles = System.IO.Directory.EnumerateFiles(strDirectory, strLastOutputNamePattern);

                                foreach(string strFile in strFiles)
                                {
                                    FileAttributes fa = File.GetAttributes(strFile);
                                    File.SetAttributes(strFile, fa & ~FileAttributes.ReadOnly);
                                    System.IO.File.Delete(strFile);
                                }
                            }

                            xeSortMgr.SetOutputFile(strFileSpec);

                            LogToConsole("");
                            LogToConsole("Beginning sort action ...");

                            xeSortMgr.DoSort();
                           
                            LogToConsole(String.Format("Sort action has completed (Elapsed {0} sec)", xeSortMgr.StopWatch.ElapsedMilliseconds / 1000.0));
                            LogToConsole("");

                            LogToConsole(String.Format("    Events read: {0}", xeSortMgr.Read_Stats));
                            LogToConsole(String.Format("Events streamed: {0}", xeSortMgr.Write_Stats));

                            if (xeSortMgr.Read_Stats != xeSortMgr.Write_Stats)
                                iRC = -3;
                            else if (0 == xeSortMgr.Read_Stats)
                                iRC = -4;
                            else
                                iRC = 0;
                        }
                    }
                    else
                    {
                        LogToConsole("ERROR: Could not locate any input files.");
                        iRC = -2;
                    }

                }
                catch (Exception e)
                {
                    LogToConsole("");
                    LogToConsole("Error / Exception encountered");
                    LogToConsole("==========================================================");
                    LogToConsole(e.ToString());
                    iRC = -99;
                }

                return iRC;
            }

        }
    }

    ScatterGather.cs

    using System;
    using System.Collections.Concurrent;
    using System.Collections;
    using System.Threading;

    namespace ScatterGatherParallelNS
    {
        #region ScatterGatherParallelMgr_IMPL
        public abstract class ScatterGatherParallelMgr<_TPayload> : IDisposable
        {
            private BlockingCollection<CMsg<_TPayload>> m_FreeQueue;        //  Msg has been processed, it can handle other payload
            private BlockingCollection<CMsg<_TPayload>> m_UnOrderedQueue;   //  Msg is dirty and needs processed when on this queue
            private BlockingCollection<CMsg<_TPayload>> m_OrderedQueue;     //  Msg is ready for final processing when on this queue
            private ArrayList m_alUnOrderedThreads;
            private Thread m_OrderedThread;
            private bool m_bInitialized;
            private Int32 m_TotalUnOrderedThreads;
            private Int32 m_TotalMessages;
            private bool m_bOrderedIsCurrentlyStalled;
            private Int64 m_iTotalOrderStalls;

            public ScatterGatherParallelMgr(int iUnorderedLimit)
            {
                m_TotalUnOrderedThreads = Math.Max(2, Environment.ProcessorCount-1);
                m_TotalUnOrderedThreads = Math.Min(iUnorderedLimit, m_TotalUnOrderedThreads);
                m_TotalMessages = m_TotalUnOrderedThreads * 128;           //  ## per CPU
                m_bInitialized = false;
                m_FreeQueue = null;
                m_UnOrderedQueue = null;
                m_OrderedQueue = null;
                m_alUnOrderedThreads = null;
                m_OrderedThread = null;
                m_bOrderedIsCurrentlyStalled = false;
                m_iTotalOrderStalls = 0;
            }

            public bool FIsOrderedWorkerStalled
            {
                get { return m_bOrderedIsCurrentlyStalled; }
            }

            public Int64 TotalOrderedWorkerStalls
            {
                get { return m_iTotalOrderStalls; }
            }

            virtual public void AbortAllWorkers()
            {
                if (null != m_alUnOrderedThreads)
                {
                    foreach (Thread t in m_alUnOrderedThreads)
                    {
                        try
                        {
                            t.Abort();
                        }
                        catch (Exception)
                        {
                        }
                    }
                }

                if (null != m_OrderedThread)
                {
                    try
                    {
                        m_OrderedThread.Abort();
                    }
                    catch (Exception)
                    {
                    }
                }
            }

            private void StartWorkersIfNecessary()
            {
                if (false == m_bInitialized)
                {
                    m_bOrderedIsCurrentlyStalled = false;
                    m_iTotalOrderStalls = 0;

                    m_FreeQueue = new BlockingCollection<CMsg<_TPayload>>();            //  Order not required
                    m_UnOrderedQueue = new BlockingCollection<CMsg<_TPayload>>();       //  Order not required

                    ConcurrentQueue<CMsg<_TPayload>> q = new ConcurrentQueue<CMsg<_TPayload>>();  //  Order required
                    m_OrderedQueue = new BlockingCollection<CMsg<_TPayload>>(q);

                    for (Int32 iMsg = 0; iMsg < m_TotalMessages; iMsg++)
                    {
                        m_FreeQueue.Add(new CMsg<_TPayload>());
                    }

                    m_alUnOrderedThreads = new ArrayList();

                    for (Int32 iUnOrdered = 0; iUnOrdered < m_TotalUnOrderedThreads; iUnOrdered++)
                    {
                        ThreadStart ts = new ThreadStart(UnOrderedEntryPoint);
                        Thread t = new Thread(ts);
                        m_alUnOrderedThreads.Add(t);
                        t.Start();
                    }

                    {
                        ThreadStart ts = new ThreadStart(OrderedEntryPoint);
                        m_OrderedThread = new Thread(ts);
                        m_OrderedThread.Start();
                    }

                    m_bInitialized = true;
                }
            }

            protected virtual void Dispose(bool bDispose)
            {
                CompleteAllWork();  //  If not shutdown do so now
            }

            [System.Diagnostics.CodeAnalysis.SuppressMessage("Microsoft.Design", "CA1063")]
            public virtual void Dispose()
            {
                Dispose(true);
                GC.SuppressFinalize(this);
            }

            public void CompleteAllWork()
            {
                if(null != m_UnOrderedQueue)
                {
                    m_UnOrderedQueue.CompleteAdding();
                }

                if (null != m_alUnOrderedThreads)
                {
                    foreach (Thread t in m_alUnOrderedThreads)
                    {
                        t.Join();
                    }

                    m_alUnOrderedThreads = null;
                }

                if (null != m_OrderedQueue)
                {
                    m_OrderedQueue.CompleteAdding();
                }

                if (null != m_OrderedThread && null != m_OrderedQueue)
                {
                    m_OrderedThread.Join();
                    m_OrderedThread = null;
                }

                if(null != m_FreeQueue)
                {
                    m_FreeQueue = null;
                }

                if(null != m_UnOrderedQueue)
                {
                    m_UnOrderedQueue = null;
                }

                if(null != m_OrderedQueue)
                {
                    m_OrderedQueue = null;
                }

                m_bInitialized = false;
            }

            public abstract void DoUnOrderedWork(_TPayload pPayload);

            private void UnOrderedEntryPoint()
            {
                try
                {
                    foreach (CMsg<_TPayload> cMsg in m_UnOrderedQueue.GetConsumingEnumerable())
                    {
                        DoUnOrderedWork(cMsg.Payload);
                        cMsg.MarkReadyForOrderedProcessing();
                    }
                }
                catch (ThreadAbortException)        //  Silent, main would have issued this
                {
                }
                catch (OperationCanceledException)
                {
                }
            }

            public abstract void DoOrderedWork(_TPayload pPayload);
            public abstract void OrderedWorkerStalled(_TPayload pPayload);
            public abstract void OrderedWorkerNoStall();

            private void OrderedEntryPoint()
            {
                Int32 iContigiousNoStalls   =   0;

                try
                {
                    foreach (CMsg<_TPayload> cMsg in m_OrderedQueue.GetConsumingEnumerable())
                    {
                        bool bAcquired = false;

                        bAcquired = cMsg.WaitForUnOrderedProcessing(0);
                        if (false == bAcquired)        //  Immediate grab attempt
                        {
                            m_iTotalOrderStalls++;
                            iContigiousNoStalls = 0;
                            m_bOrderedIsCurrentlyStalled = true;
                            OrderedWorkerStalled(cMsg.Payload); //  Signal we are going to a wait state on the specific payload
                        }
                        else
                        {
                            if (0 == (++iContigiousNoStalls % (m_TotalMessages * 0.25)))
                            {
                                OrderedWorkerNoStall(); //  Signal that prediction could feed us more
                                m_bOrderedIsCurrentlyStalled = false;
                            }
                        }

                        if (false == bAcquired)
                        {
                            cMsg.WaitForUnOrderedProcessing(-1);
                        }

                        DoOrderedWork(cMsg.Payload);

                        cMsg.MakeReadyForFreeQueue();
                        m_FreeQueue.Add(cMsg);
                    }
                }
                catch (ThreadAbortException)        //  Silent, main would have issued this
                {
                }
                catch (OperationCanceledException)
                {
                }
            }

            public bool FQueueWorkItem(_TPayload pPayload)
            {
                bool bRC = false;
                StartWorkersIfNecessary();

                if (true == m_bInitialized)
                {
                    CMsg<_TPayload> cMsg = m_FreeQueue.Take();
                    cMsg.Payload = pPayload;

                    //  Place on both queues
                    m_UnOrderedQueue.Add(cMsg);
                    m_OrderedQueue.Add(cMsg);
                    bRC = true;
                }

                return bRC;
            }

            #region CMsg_IMPL
            public sealed class CMsg<_TPayloadType> : IDisposable
            {
                public CMsg()
                {
                    m_Payload = default(_TPayloadType);
                    m_OrderedReadyLock = new AutoResetEvent(false);
                    MakeReadyForFreeQueue();
                }

                public void Dispose()
                {
                    if (null != m_OrderedReadyLock)
                    {
                        m_OrderedReadyLock.Close();
                        m_OrderedReadyLock.Dispose();
                        m_OrderedReadyLock = null;
                    }

                    GC.SuppressFinalize(this);
                }

                public void MakeReadyForFreeQueue()
                {
                    m_Payload = default(_TPayloadType);
                }

                private _TPayloadType m_Payload;
                public _TPayloadType Payload
                {
                    get { return m_Payload; }
                    set { m_Payload = value; }
                }

                private AutoResetEvent m_OrderedReadyLock;
                public void MarkReadyForOrderedProcessing()
                {
                    m_OrderedReadyLock.Set();
                }

                public bool WaitForUnOrderedProcessing(Int32 iTimeout)
                {
                    return m_OrderedReadyLock.WaitOne(iTimeout);
                }
            }
            #endregion

        }
        #endregion

    }
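The three-queue pattern above (free, unordered, ordered) can be exercised with a small hypothetical consumer. The payload type, class names, and worker limit in this sketch are assumptions for illustration, not part of the original tool:

```csharp
using System;
using ScatterGatherParallelNS;

namespace ScatterGatherDemoNS
{
    //  Hypothetical payload: carries the input and the result the unordered workers compute
    class CSquareItem
    {
        public int Input;
        public long Result;
    }

    class CSquareMgr : ScatterGatherParallelMgr<CSquareItem>
    {
        public CSquareMgr() : base(4)       //  Cap the unordered worker count at 4
        {
        }

        //  Runs on any unordered worker thread, in any order
        public override void DoUnOrderedWork(CSquareItem item)
        {
            item.Result = (long)item.Input * item.Input;
        }

        //  Runs on the single ordered thread, in submission order
        public override void DoOrderedWork(CSquareItem item)
        {
            Console.WriteLine("{0}^2 = {1}", item.Input, item.Result);
        }

        public override void OrderedWorkerStalled(CSquareItem item) { }
        public override void OrderedWorkerNoStall() { }
    }

    class Demo
    {
        static void Main()
        {
            using (CSquareMgr mgr = new CSquareMgr())
            {
                for (int i = 0; i < 10; i++)
                {
                    mgr.FQueueWorkItem(new CSquareItem { Input = i });
                }

                mgr.CompleteAllWork();      //  Drain both queues and join the workers
            }
        }
    }
}
```

Because `FQueueWorkItem` places each message on both the unordered and the ordered queue, the ordered thread blocks on a message's event until some unordered worker has finished it; that handshake is what preserves submission order in the output.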

    Powershell Test Driver

    $core =   [Reflection.Assembly]::LoadFrom('Microsoft.SqlServer.XE.Core.dll');
    $xelinq = [Reflection.Assembly]::LoadFrom('Microsoft.SqlServer.XEvent.Linq.dll');

    $MergeXeventAssembly = [Reflection.Assembly]::LoadFrom('.\bin\x64\Release\MergeXEvents.exe');
    if($null -eq $MergeXeventAssembly)
    {
        throw 'ERROR: Unable to load assembly';
    }

    $bDumpRaw = $true;

    if($true -eq $bDumpRaw)
    {
        [MergeXEvents.Exporter]::GetIList('C:\temp\NewTestTraces\BigXEL\Test_0_130494871596480000.xel', 'C:\temp\NewTestTraces\BigXEL');
    }
    else
    {
            $iRow = 0;

            $events = [MergeXEvents.Exporter]::GetIList('C:\temp\NewTestTraces\BigXEL\Test_0_130494871596480000.xel', 'C:\temp\NewTestTraces\BigXEL');
            foreach($event in $events)
            {
                 $iRow.ToString() + ' Event: ' + $event.Event_name + '|Package: ' + $event.Event_package_name + '|Timestamp: ' + $event.Event_timestamp  | format-table -auto;
                 $iRow++;
            }
    }

    Bob Dorr - Principal Software Engineer SQL Server


    Why does the elevation prompt have only the wallpaper as its background?



    One small change to the elevation interface in Windows 8
    has to do with the image behind the elevation prompt.
    In earlier versions of Windows,
    the image was a snapshot of your desktop,
    including all your open windows.
    But in Windows 8,
    it's just a picture of your wallpaper.
    Why did this change?



    We discovered that
    when you ran an elevated application,
    the screen capture performed by the elevation prompt often looked ugly
    because there was a good chance it caught an animation mid-stream.
    For example, when you run an application
    elevated from the Windows 8 Start page,
    the tiles zoom out, your name in the upper right corner fades out,
    and the entire screen cross-fades to the desktop.
    As a result, the snapshot showed your tiles in some intermediate
    location, a faint image of your name in the corner,
    and a half-visible desktop.



    Okay, so work could've been done to, say, wait for all animations
    to complete before taking the screen capture.
    That and other solutions were proposed and considered,
    but the feature simply didn't make the cut.
    There were too many other things that had to be done,
    and those other things took higher priority.



    Engineering is about trade-offs,
    and trade-offs are rarely easy.
    If you decide to do everything,
    then you find yourself trying to cram two years of work
    into one month of schedule,
    and that rarely ends well.
    You have to prioritize what is important and what is less important,
    and then exercise your creativity to come up with
    alternate solutions which, while perhaps not ideal,
    get the job done in less time and with lower risk.

    Free Agile Productivity Courses


    Agility, simply put, is “responding to change.”

    When you know, “what got you here, won’t get you there”, and you change your approach, that’s agility in action.

    Agile productivity is about getting better results by learning how to adapt and respond to the changing world around you.

    By adopting a growth-mindset, playing to your strengths, and taking a whole-person approach to productivity you change your game.

    By creating effective feedback loops, and using your learning to change your approach, you raise your game.

    That sounds easy, but the implementation can be challenging.

    Fear not, simply follow a few free courses that will put all we know about getting better, faster, easier, more meaningful results on your side …

    7 Days of Agile Results (Free Course)

    This is 7 days of self-paced mini-lessons where you learn how to implement a simple habit for high-performance – the Monday Vision, Daily Wins, Friday reflection pattern from Getting Results the Agile Way.

    Here’s what you will learn over 7 days:

    1. Proven practices to master time management, motivation, and personal productivity
    2. Discover the one way to stack the deck in your favor that’s authentic and works
    3. How to embrace change and get better results in any situation
    4. How to focus and direct your attention with skill
    5. How to use your strengths to create a powerful edge for getting results
    6. How to change a habit and make it stick
    7. How to never miss the things that matter most, and achieve better work-life balance
    8. How to spend more time doing the things you love

    Start your 7 Days of Agile Results (Free Course).

    30 Days of Getting Results (Free Course)

    I find that picking a theme for the month, and focusing on daily insights or improvement is a great way to make breakthroughs.

    This is 30 days of self-paced productivity lessons that could very well lead to your greatest breakthroughs, ever.

    Here’s what you will learn over 30 days:

    1. How to double or triple your personal productivity (if not more)
    2. How to use your strengths to amplify your impact
    3. How to motivate yourself with skill and find your drive
    4. How to change a habit and make it stick
    5. How to make the most of your moments, days, weeks, months, and years
    6. How to use a simple system to achieve meaningful results
    7. How to master the art and science of work-life balance
    8. How to focus and direct your attention with skill
    9. How to master time management
    10. How to spend more time on the things that really matter to you
    11. How to write your story forward

    Start your 30 Days of Getting Results (Free Course).

    From Windows 10 Eye Control and the Xbox Adaptive Controller, to Language Understanding and Custom Image Reco – What a journey!


    This article lists content describing how a demonstration game leveraged many accessibility-related features of Windows, and incorporated a variety of Azure Cognitive Services.

    Introduction

    Around this time last year, I began an experiment into how a Microsoft Store app might leverage many accessibility-related features of Windows. By creating a demo solitaire game called Sa11ytaire, I, with the help of my colleague Tim, showed how straightforward it can be to build an app which can be used with many input and output mechanisms. Our goal was to encourage all app builders to consider how their own apps can be efficiently used by as many people as possible. I published my first articles describing the technical steps for building the app at Microsoft Windows UI Automation Blog.

    Since then I continued experimenting, with the goal of demonstrating how Azure Cognitive Services might help to enable exciting new scenarios in apps. I really found it fascinating to explore how such things as speech to text, image recognition, language understanding and bots might help a game become more usable to more people. I continued documenting the technical aspects of updating the demo game to incorporate use of Azure Cognitive Services on my own LinkedIn page, in Barker's articles.

    Below I've included links to the full set of twelve Sa11ytaire articles. Or rather, I've included links to what's currently the full set. The Sa11ytaire Experiment never ends.

    All the best for 2019 everyone!

    Guy

     

    The articles

    1. The Sa11ytaire Experiment: Part 1 – Setting the Scene

    Figure 1: The Windows Eye Control UI showing over the Sa11ytaire app, and being used to move a 10 of Clubs over to a Jack of Hearts.

     

    2. The Sa11ytaire Experiment – Enhancing the UIA representation

    Figure 2: The Inspect SDK tool reporting the UIA properties of an upturned card element, with properties relating to Name, ControlType, HelpText, and the Toggle pattern highlighted.

     

    3. Sa11ytaire on Xbox: Let the experiment begin!

    Figure 3: The Sa11ytaire game being played on the Xbox, with a switch device plugged into an Adaptive Controller. A ten of diamonds is being moved onto a jack of clubs.

     

    4. The Sa11ytaire Experiment – Reacting to Feedback Part 1: Keyboard efficiency

    Figure 4: The Sa11ytaire app running on a Surface Book with an external number pad connected.

     

    5. The Sa11ytaire Experiment – Reacting to Feedback Part 2: Localization and Visuals

    Figure 5: The high contrast Sa11ytaire app being played with a footswitch plugged into a Bluetooth-connected Xbox Adaptive Controller.

     

    6. The Sa11ytaire Experiment - End of Part 1

    Figure 6: The Sa11ytaire app running on a Surface Book, surrounded by a variety of input devices.

     

    7. The Sa11ytaire Experiment Part 2: Putting the AI in Sa11ytAIre

    Figure 7: The Sa11ytaire app showing the results of the Azure Language Understanding service. An utterance of "Grab the 4 of clubs and put it on the 3 of clubs" was matched with an intent of "MoveCard" with a confidence of 0.99965173.

     

    8. Using Azure Custom Vision services to play Sa11ytaire by presenting a playing card to the game

    Figure 8: The physical 2 of Diamonds card being held up in front of a Surface Book running the Sa11ytaire app.

     

    9. Delivering a more helpful game experience through Azure Language Understanding, Q&A, and Search

    Figure 9: The Sa11ytaire app showing the result of a Bing Web Search in response to the player asking the question "Where can I buy Braille playing cards?" at the app.

     

    10. Adding an Azure Bot to a game to enable a more natural experience

    Figure 10: The Sa11ytaire app hosting the Sa11y bot using the Azure Bot Service's web chat UI.

     

    11. Interacting with an Azure bot directly from a Windows Store app

    Figure 11: The Sa11ytaire app interacting directly with the Sa11y bot, and using Speech to Text and Text to Speech to interact with the player.

     

    12. The Sa11ytaire Experiment - End of Part 2

    Figure 12: The Sa11ytaire app running on a Surface book, showing Sa11y the bot. By the device are playing cards and a Dalek wearing headphones.

    Lesson Learned #52: Azure Database for MySQL – Server is not configured to allow ipv6 connections


    Hello,

    Several days ago, I worked on a service request where our customer had configured all the elements (VPN, Site-to-Site, etc.), but they were getting the error ERROR 9009 (28000): Server is not configured to allow ipv6 connections when trying to connect using a service endpoint.

    First of all, I have to mention that, currently, Azure Database for MySQL does not support IPv6.

    • Our customer had this scenario:
      • None of the machines created in the virtual network and its subnets had a public IP.
      • Our customer had an incorrect service endpoint configuration for Azure Database for MySQL.
      • The NSG blocked all internet connections to the gateway.
    • Root cause:
      • After recreating the same scenario as our customer, I was able to reproduce the issue.
      • When the Linux machine tried to connect to the Azure Database for MySQL server, our customer got the error message ERROR 9009 (28000): Server is not configured to allow ipv6 connections, because the source IP was an internal IP rather than a public IP. With the Azure Database for MySQL service endpoint misconfigured, the server received an address that it treated as IPv6.
      • After correctly configuring the VNet and subnets, our customer was able to connect without problems.
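    As a quick sanity check from the client machine, you can verify that the server name resolves to an IPv4 address before attempting to connect. This is a sketch, not part of the original troubleshooting; the server name below is a hypothetical placeholder.

    ```csharp
    using System;
    using System.Linq;
    using System.Net;
    using System.Net.Sockets;

    class CheckMySqlResolution
    {
        static void Main()
        {
            // Hypothetical host name; substitute your own Azure Database for MySQL server.
            var host = "myserver.mysql.database.azure.com";

            var addresses = Dns.GetHostAddresses(host);

            // Azure Database for MySQL currently accepts only IPv4 connections,
            // so the name should resolve to at least one IPv4 (A record) address.
            var ipv4 = addresses
                .Where(a => a.AddressFamily == AddressFamily.InterNetwork)
                .ToList();

            Console.WriteLine(ipv4.Any()
                ? $"IPv4 address found: {ipv4.First()}"
                : "No IPv4 address found - the client may attempt an IPv6 connection.");
        }
    }
    ```
    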

    Enjoy!

    How can I prevent a WebView control from opening a browser window?



    A customer had an application that used a UWP WebView control. Some Web sites open links in a new window by using techniques like TARGET=_blank. When the user clicks such a link, it opens in a Web browser. The customer wanted to know how to prevent this.



    To do this, you can handle the NewWindowRequested event. You can mark the event as Handled, in which case the system will consider the action complete and will not send the request to the user's default Web browser.




    <!-- XAML -->
    <WebView NewWindowRequested="OnNewWindowRequested" />

    // C# code-behind
    void OnNewWindowRequested(WebView sender, WebViewNewWindowRequestedEventArgs e)
    {
        // Block all requests to open a new window
        e.Handled = true;
    }



    You can inspect the Referrer and Uri properties to learn more about what triggered the new window.



    • Referrer is the page that wants to open the window.

    • Uri is the page that it wants to open.



    If your handler is a coroutine, then you must set Handled = true before performing any await operations, because the handler returns to its caller as soon as you perform an await, and the rest of the handler runs as an asynchronous task.
