DevReach Conference 2018

December 6th, 2018 | Posted by Robert Curlette in .NET | Conference - (0 Comments)

Being a developer means we must constantly update our knowledge with the latest tools, techniques, frameworks and cloud options. One of the best ways to do that is to attend a developer conference, where we can absorb a lot of information in a few days and also network with fellow developers.

This year I was happy to return to DevReach in Sofia, an event organized by Progress Software (formerly Telerik) and focused on the Microsoft .Net technology stack. The first time I attended was in 2013 at a local cinema in Sofia, and I was impressed by the quality of the organization and the high quality of the talks and speakers. The event was one of the inspirations for me when I created the Tridion Developer Summit conference. This year did not disappoint! We were back at the cinema, with a full agenda spanning several tracks and high quality speakers. The speakers were all industry experts, mostly from Microsoft or Progress Software, and were very familiar with the content they spoke about.

A full list of sessions is available at https://devreach.com/sessions/

They also have a Facebook page with photos of the event.  https://www.facebook.com/DevReach/

All talks were recorded and will show up on the Progress YouTube channel here:  https://www.youtube.com/channel/UCwr0eQsblxgpjkUXbiCjrRA

The agenda was filled with talks on AI and Machine Learning, Serverless and Docker, and also cloud and Azure. We had a few sessions about .Net, and these were very well attended, with people standing in the aisles or sitting on the floor in front (I had to do that a couple of times myself), all with their eyes glued to the code on the screen as the speakers unraveled it in front of us. A good sign, if you ask me!

One of my favorite talks at the event was the Docker presentation by Chris Klug (https://twitter.com/ZeroKoll). He not only knew his content, he presented it in a very funny and interesting way. He walked us through why to use Docker, and how he uses it to run .Net Core in an isolated and trusted way on his laptop. Slides were kept to a minimum and he spent lots of time at the command prompt, issuing commands like a true console ninja and wow’ing us in the process. He ran over time, but he still wanted to show his last demo, so he fired off a few commands at the command prompt, hit return, waited 10 seconds, and it all worked…or it seemed to.

Another nice talk was the session after his, by Tibi Covaci (from Codemag), on the state of .Net – what is happening with .Net Core and with the latest version of the .Net Framework, 4.6. This was a really nice summary and the presenter knew his content really well.

I also attended a few AI talks, which mostly focused on the business case and were pitched at an introductory level, showing code only within PowerPoint. I hope next year we see more live demos and live coding, and also deeper talks.

On day 2 I was really looking forward to the Sitefinity Headless CMS session, since I have a background in Content Management and wanted to see how Sitefinity would handle this new trend. After some basic concepts were explained, we were shown the OData response in the browser, and then a WebApp and a Mobile App that consumed the CMS data from the OData service. I have used OData in the past and was not too impressed with it; it seems to get difficult when filtering results or doing more advanced queries of the dataset. One of the participants made some comments about exactly this, and then said he had later figured it out. But unfortunately the talk was very short, using less than half of the allotted time, and didn’t seem well prepared. I would have liked to see more code, more behind the scenes of how he consumed the data, and in general more discussion around the idea of headless versus standard.

After this talk I listened in on a .Net Rocks live show, always funny and a pleasure to listen to the topics discussed and how they move through so many different ideas. The discussion went to Serverless and tooling, with one of the guests speaking about VIM and her use of the editor. Others spoke about VS Code, and how helpful the debugger is in it.

Finally, the day wrapped up with a very inspirational talk from Neal (reverentgeek), titled ‘You Are Awesome!’. He comes from my home state of Georgia, and it was nice to hear a familiar accent. 🙂 Very inspiring, authentic and from the heart; Neal shared with us that we can all get better at something by putting in the time and dedication and continuing to practice. For example, a few years ago Neal discovered he likes to draw and made a sketch of a man for a conference talk; three years later he made the same sketch again, and the level of detail and quality had improved a lot! That reminded me that we need to practice a bit every day, be nice to ourselves, allow ourselves to fail sometimes, and keep moving forward. Big thanks to Neal for sharing his honest story.

We were all invited to a 10 year celebration party at a nearby hotel, where a famous Bulgarian rock band performed while we networked with the speakers and other attendees.

Overall, the event was very well organized and offered a lot to the participants. Big thanks to Progress for hosting the event! I hope they are able to continue providing such a high quality Developer event to the Balkan region.

DXA 2 promises to publish less verbose JSON, sending smaller JSON items into the Publishing Queue and therefore speeding up our publish times.  It will also consume less space in the Broker DB.  So they say!  But what does the new and old JSON look like, and is it really much lighter?  In this article I’ll highlight only the differences between the rendered JSON of a Keyword field and a Text field.  You might be wondering why I even care (aside from curiosity): I’ve built a TBB that amends the published DXA JSON and injects Structure Group Metadata fields masquerading as Page Metadata fields (in the JSON), making them available to our DXA Frontend WebApp.  OK, so here it is:

DXA 1.7 Keyword Field

"people": {
 "Name": "people",
 "Values": [ "Public and Member Communications", "Public Interest people", "Publications and Databases" ],
 "NumericValues": [ ],
 "DateTimeValues": [ ],
 "LinkedComponentValues": [ ],
 "FieldType": 3,
 "CategoryName": "people List",
 "CategoryId": "tcm:11-11393-512",
 "XPath": "Metadata/custom:people",
 "KeywordValues": [
 {
 "IsRoot": false,
 "IsAbstract": false,
 "Description": "",
 "Key": "",
 "TaxonomyId": "tcm:11-11393-512",
 "Path": "\\people List\\Public and Member Communications",
 "RelatedKeywords": [ ],
 "ParentKeywords": [ ],
 "MetadataFields": { },
 "Id": "tcm:11-106852-1024",
 "Title": "Public and Member Communications"
 },
 {
 "IsRoot": false,
 "IsAbstract": false,
 "Description": "",
 "Key": "",
 "TaxonomyId": "tcm:11-11393-512",
 "Path": "\\people List\\Public Interest people",
 "RelatedKeywords": [ ],
 "ParentKeywords": [ ],
 "MetadataFields": { },
 "Id": "tcm:11-106848-1024",
 "Title": "Public Interest people"
 },
 {
 "IsRoot": false,
 "IsAbstract": false,
 "Description": "",
 "Key": "",
 "TaxonomyId": "tcm:11-11393-512",
 "Path": "\\people List\\Publications and Databases",
 "RelatedKeywords": [ ],
 "ParentKeywords": [ ],
 "MetadataFields": { },
 "Id": "tcm:11-106853-1024",
 "Title": "Publications and Databases"
 }
 ]
}

DXA 2.0 Keyword Field

"people": {
 "$type": "KeywordModelData[]",
 "$values": [
 {
 "Id": "106852"
 },
 {
 "Id": "106848"
 },
 {
 "Id": "106853"
 }
 ]
},

DXA 1.7 Text Field

"language": {
 "Name": "language",
 "Values": [ "English" ],
 "NumericValues": [ ],
 "DateTimeValues": [ ],
 "LinkedComponentValues": [ ],
 "FieldType": 0,
 "XPath": "Metadata/custom:language",
 "KeywordValues": [ ]
},

DXA 2.0 Text Field

"language": "English",

Summary

So, there we have it, the new DXA 2.0 JSON delivers what it promises – much leaner and meaner JSON for the benefit of us all.

During a recent DXA project we experienced strange errors when publishing some content.  We are moving to DXA, and the new DXA templates try to render every field of the Component and of linked-to Components.  This is usually great, but if your content is a bit stale, or possibly contains an invalid value (like mine did), then a small Core Service script might be helpful.  (You should create a normal Core Service app, mine is a Console App, and reference all the usual Tridion assemblies.)

It was complaining that a Category did not contain the Keyword.  The error message was ‘Keyword with title ‘Long Lost Keyword’ does not exist in Category ‘Amazing Category’ [tcm:7-12345-512].’ After some digging we found a Multimedia Component with invalid Metadata.   To view the invalid Metadata we used the excellent Tridion Alchemy plugin Show Item XML from the team at Content Bloom.  If you haven’t tried Alchemy yet, now is the perfect time, and this plugin alone makes it worth the (free) install.  You can even just install it on your Dev server if you want.

The simple solution for our invalid metadata problem would be to change it in the GUI, but the GUI didn’t show the value, and we were stuck.  So, we decided to write a small Core Service script that updates the metadata field or removes it.  Hope this helps.
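A minimal sketch of that kind of fix-up script follows. This is not the exact script from the project: the endpoint name, TCM URIs, schema namespace and field name below are all placeholders you would adjust to your environment.

```csharp
// Hedged sketch only - all URIs, the endpoint name, the metadata schema
// namespace and the field name are placeholders, not values from our project.
var client = new SessionAwareCoreServiceClient("netTcp_201501");
var readOptions = new ReadOptions();

var component = (ComponentData)client.Read("tcm:7-4321", readOptions);

// Component metadata is plain XML, so we can edit it with LINQ to XML
XDocument metadata = XDocument.Parse(component.Metadata);
XNamespace ns = "uuid:00000000-0000-0000-0000-000000000000"; // your metadata schema namespace
XElement badField = metadata.Root.Element(ns + "people");

if (badField != null)
{
    // Option 1: point the field at a Keyword that does exist in the Category
    badField.Value = "Valid Keyword Title";

    // Option 2: remove the invalid field entirely
    // badField.Remove();

    component.Metadata = metadata.ToString();

    // Update performs an implicit check-out / check-in of the Component
    client.Update(component, readOptions);
}
```

The same pattern works for any item type whose Metadata property holds the offending XML.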

 

Currently as part of the Publish process, in a post-build event I am sending JSON to an external search engine.  As part of that process, I wait for a response from the search engine that the content arrived successfully.

However, when it doesn’t arrive, I would like to notify authors via the Publish Queue status that it didn’t get there.  One solution is to update the Publish Status on the published item, setting it to Warning or Failed, and also update the text in the Publish Transaction.  The code below shows how we can do this.  I implemented it as a WebService, so it can be called from any external system, including the event system.

 public void Get(string transactionUri)
 {
      // transactionUri arrives without the "tcm:" prefix
      string uri = "tcm:" + transactionUri;
      string binding = "netTcp_201501";
      SessionAwareCoreServiceClient client = new SessionAwareCoreServiceClient(binding);

      // Read the Publish Transaction, flag it as Warning and explain why
      PublishTransactionData trans = client.Read(uri, new ReadOptions()) as PublishTransactionData;
      trans.State = PublishTransactionState.Warning;
      trans.Information = "Didn't make it to Search Index - please publish again";
      client.Update(trans, new ReadOptions());
 }
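For completeness, calling the endpoint from an external process (or from the event system) could look like the snippet below. The host name and route are assumptions about how you host the WebService; they are only here to show the shape of the call.

```csharp
// Hypothetical caller - the URL depends entirely on where the WebService
// above is hosted; adjust host and route to your own deployment.
using (var http = new System.Net.WebClient())
{
    // transactionUri is passed without the "tcm:" prefix, as Get() expects
    http.DownloadString(
        "http://cms.example.com/publishstatus/get?transactionUri=0-12345-66560");
}
```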

 

Here are some helpful tips when working with JSON.NET.  It is sometimes challenging to find the right method or property to use because we don’t have access to intellisense when using the ‘dynamic’ object type often preferred by JSON.NET.
The context is that I get an existing JSON document, and I need to add new properties and nodes into the document.
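To make the tips concrete, here is a small sketch of the patterns I mean: reading and adding properties on a dynamic JObject. The document and property names are examples only, and the snippet assumes the Newtonsoft.Json package is referenced.

```csharp
using System;
using System.Collections.Generic;
using Newtonsoft.Json.Linq;

// Example only - the document shape and property names are made up.
dynamic doc = JObject.Parse(@"{ ""Title"": ""Home"", ""MetadataFields"": { } }");

// Reading a property (no intellisense on dynamic, so spelling must be exact)
string title = doc.Title;

// Adding a simple property
doc.Author = "Robert";

// Adding a nested object
dynamic zone = new JObject();
zone.Name = "zone";
doc.MetadataFields.zone = zone;

// Adding a JSON array from a .NET list
doc.Tags = JArray.FromObject(new List<string> { "dxa", "json" });

// Print the amended document
Console.WriteLine(doc.ToString());
```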

I hope these examples of the excellent JSON.NET library were of some help to you.  If you have any more tips or suggestions for the code, I am happy to hear them.  Thanks.

While creating some JSON for the Metadata on a Structure Group in C#, I came across an interesting challenge displaying the URI and Title of the embedded Schema. Initially I was working with the base FieldDefinition type, and the Id and Title properties of the embedded Schema were not available on it. Casting the field’s Definition to an EmbeddedSchemaFieldDefinition was the proper way to access the embedded Schema’s info.

 

// Cast the field's Definition to get at the embedded Schema information
EmbeddedSchemaField right1Field = pageMetaFields[fieldName] as EmbeddedSchemaField;
EmbeddedSchemaFieldDefinition def = right1Field.Definition as EmbeddedSchemaFieldDefinition;

// embeddedSchemaFieldDef is my own object that is later serialized to JSON
embeddedSchemaFieldDef.RootElementName = def.EmbeddedSchema.RootElementName; // "Content"
embeddedSchemaFieldDef.Id = def.EmbeddedSchema.Id;       // "tcm:11-123-8"
embeddedSchemaFieldDef.Title = def.EmbeddedSchema.Title; // "Metadata fieldname"

Amending DXA JSON

January 18th, 2018 | Posted by Robert Curlette in DXA - (0 Comments)

Sometimes you may wish to add content that is not part of the default Component fields into the DXA JSON published to the Broker. In my situation I would like to have fields from the Page and Structure Group Metadata available in my DXA view. With DXA 1.7 there is no out-of-the-box, or ‘accelerated’, way to do this. My idea is to create an additional C# TBB that accesses the DXA JSON and adds a field to it. The advantage of this approach is that the additional field will be seen by the DXA runtime as a Page Metadata field, and therefore serialized and available to us in the View without doing anything special in the DXA Frontend WebApp.

In this post I will share a sample app I used to access the DXA JSON and add an additional property to the MetadataFields collection. It was quite tricky to get this code to work, as we need to use the ‘dynamic’ type in C#, which comes without intellisense, so finding the appropriate methods and properties was a bit of a challenge. You will need to add the Newtonsoft JSON library from NuGet to get the code to work.

In a future article I will share more about the C# TBB that uses this code. The dxa.json file is the output of the Generate Dynamic Page (DXA) TBB from Template Builder. Also, in the code below I only show an example for a text field, and I do not produce the JSON for XPM, as I have no intention of using XPM on these ‘fake’ Page Metadata fields.

using System.Collections.Generic;
using Newtonsoft.Json.Linq;
using System;

namespace temp1
{
    class Program
    {
        static void Main(string[] args)
        {
            string text = System.IO.File.ReadAllText(@"C:\RC\dxa.json");
            dynamic jsonObject = JObject.Parse(text);
            dynamic meta = jsonObject.MetadataFields;

            dynamic zone = new JObject();
            zone.Name = "zone";

            List<string> values = new List<string>()
            {
               "value"
            };

            dynamic array = JArray.FromObject(values);
            zone.Values = array;
            zone.FieldType = 0;  // text type, need to change if other field type

            // title is a mandatory metadata field on the Page
            meta.Property("title").AddAfterSelf(new JProperty("zone", zone));

            // Write out the amended JSON so we can inspect the result
            Console.WriteLine(jsonObject.ToString());
            Console.ReadLine();
        }
    }
}

DXA 1.7 includes Template Building Blocks that help us publish content in JSON format to the Tridion Broker database. One of these Template Building Blocks is GenerateSitemap.tbb, which publishes all Pages and Structure Groups from your entire website as one JSON file. You might think this sounds great – and it really is – one great big file! However, if you have thousands of Pages and hundreds of Structure Groups, you might be interested in publishing only the Structure Group info and index pages in the Navigation JSON file. In this short post I’ll share the code I used to get started.  If this is your first time hacking the DXA Template Building Blocks, you might want to check out my post here on how to get started compiling the DXA TBBs.

The idea here is to create a new TBB, GenerateNavigation.tbb, that publishes just the Structure Groups and index pages.

I’ve re-used the entire GenerateSitemap.tbb file and just modified one of the methods.

Here is my new modified method and also the call to it.

 

public override void Transform(Engine engine, Package package)
{
    Initialize(engine, package);

    _config = GetNavigationConfiguration(GetComponent());
   
    SitemapItem sitemap = GenerateStructureGroupNavigation(Publication.RootStructureGroup, true);

    string sitemapJson = JsonSerialize(sitemap);

    package.PushItem(Package.OutputName, package.CreateStringItem(ContentType.Text, sitemapJson));
}

 

private SitemapItem GenerateStructureGroupNavigation(StructureGroup structureGroup, bool structureGroupsOnly)
{
    SitemapItem result = new SitemapItem
    {
        Id = structureGroup.Id,
        Title = GetNavigationTitle(structureGroup),
        Url = System.Web.HttpUtility.UrlDecode(structureGroup.PublishLocationUrl),
        Type = ItemType.StructureGroup.ToString(),
        Visible = IsVisible(structureGroup.Title)
    };

   
    foreach (RepositoryLocalObject item in structureGroup.GetItems().Where(i => !i.Title.StartsWith("_")).OrderBy(i => i.Title))
    {
        SitemapItem childSitemapItem = null;
        Page page = item as Page;
        if (page != null)
        {
            if (page.FileName == "index")  // Add pages with the name index - APA Custom
            {
                if (!IsPublished(page))
                {
                    continue;
                }

                childSitemapItem = new SitemapItem
                {
                    Id = page.Id,
                    Title = GetNavigationTitle(page),
                    Url = GetUrl(page),
                    Type = ItemType.Page.ToString(),
                    PublishedDate = GetPublishedDate(page, Engine.PublishingContext.TargetType),
                    Visible = IsVisible(page.Title)
                };
            }
        }
        else
        {
            childSitemapItem = GenerateStructureGroupNavigation((StructureGroup)item, true);
        }
        if(childSitemapItem != null)
        {
            result.Items.Add(childSitemapItem);
        }
    }
   
    return result;
}

In this article I’ll discuss the process of downloading and compiling the Default DXA TBBs, and then we can add our new TBB to the default DXA project.  You might want to do this so you can add another TBB into the DXA project, or to modify one of the existing ones.  However, the DXA team would like to get your updates as a Pull Request so they can make the existing ones even better.

Modifying the Default DXA TBBs

1. Download the sources (with the correct version selected) from here: https://github.com/sdl/dxa-content-management

2. Open the solution and look at the Properties, then Build Events. There is a post-build event with the following:
"C:\Program Files (x86)\Microsoft\ILMerge\ILMerge.exe" ^
/out:Sdl.Web.Tridion.Templates.merged.dll ^
/targetplatform:v4 ^
/lib:C:\_references\cm-8.1 ^
Sdl.Web.Tridion.Templates.dll DD4T.ContentModel.Contracts.dll DD4T.ContentModel.dll DD4T.Serialization.dll DD4T.Templates.Base.dll Newtonsoft.Json.dll /log

3. Check if you have ILMerge.exe in the folder C:\Program Files (x86)\Microsoft\ILMerge\ILMerge.exe. If not, then download here: https://www.microsoft.com/en-us/download/details.aspx?id=17630

4. Copy the DLLs from the Tridion/bin/client folder to a folder on your local drive. I prefer to keep the references as part of the project. For example, I use: C:\RC\dxa-tbbs-1.7\dxa-content-management-release-1.7\references

All DLLs are required, even the ECL ones, and they’re all listed in the README here: https://github.com/sdl/dxa-content-management. If you don’t have ECL installed, you’ll need to install it at least on your Dev server to get the DLLs. You can use Add/Remove Programs and the ‘Change’ option to add the feature. A restart is required, because the GUI will complain if you install ECL without restarting.  Also, the DLLs listed after the /lib path are expected to be found in the /bin/debug folder of the project.

5. Build

Potential errors:
1. Error code 3 – This means Visual Studio cannot find ILMerge.exe
2. Error code 1 – It cannot find the DLLs folder specified in the post-build script

Tips:
– Use the /log switch in the post-build command to write the output to the ‘Output’ window for easier debugging

Happy hacking!

DXA and the SDL.Web.Tridion.dll

September 28th, 2017 | Posted by Robert Curlette in DXA - (0 Comments)

When creating a new DXA (1.7) Web Application we can use the DXA Core sample website to get started, or we can start fresh and build it from the ground up using the NuGet packages.  In a recent project we wanted to start fresh, using only the framework itself and not any of the samples provided OOTB.  While some may say we lose the ‘acceleration’ by taking this path, others could argue that in most client applications they prefer a clean solution where they know what all the code does, and why, with no extra unused stuff inside.  So, anyway, we decided to take the high road and start from a ‘File, New Project’ approach.  It hasn’t been easy, but it’s been real.

While making the new project you will have almost everything you need – except for one very important DLL that is not included: the SDL.Web.Tridion.dll file.  It is referenced from the Unity IoC container, and you can see it in the Unity.config file, but it is not referenced in the project references.  When you don’t have this file in the /bin folder, you will get the following error message:

The type name or alias DefaultCacheProvider could not be resolved. 
Please check your configuration file and verify this type name.

The solution is quite simple, but easy to miss.  The DXA Sample Project .csproj file includes a very important command that tells the project to copy the DLL.

 <Target Name="BeforeBuild">
 <CallTarget Targets="CopyDxaFrameworkLibsToOutput" />
 </Target>

In your own .csproj file, copy the above config anywhere on the top level.  I placed mine before the final closing </Project> tag.  Now, re-open the project in Visual Studio and build, and you should see the friendly SDL.Web.Tridion.dll file in the bin folder and your website will be happy again.