Author Archives: Robert Curlette

About Robert Curlette

Initiated into the Tridion ways at a young age, Robert has traveled the world using his Tridion knowledge to help other new Tridion initiates learn the secrets of building websites with the Tridion CMS. He also organizes the Tridion Developer Summit, where the Tridion community gets together to share their stories and experience.

Tridion Upgrades – Approach and Customizations

December 27th, 2018 | Posted by Robert Curlette in Tridion | Upgrades - (0 Comments)

As 2018 comes to a close, many people are planning next year’s projects, and an upgrade might be part of them.  The good news with the latest SDL Tridion release, named ‘Tridion Sites 9’, is that the word ‘Tridion’ is back in the name!  OK, all kidding aside, the other good news is that little has changed in the product since Web 8 and Web 8.5.  Meaning, if you’re thinking of upgrading from 8.1 or 8.5, expect few hidden bumps or issues.  However, the bad news is also that little has changed since Web 8.1 or Web 8.5.  So, if you were expecting to see some of the suggestions on the SDL Ideas Site, you’ll have to wait a bit longer.  One long-standing request that is implemented (I’m aware of it since at least 2007) is image cropping and re-sizing within the backend GUI.  Yay!  And if you’re curious what’s coming out in 2019 and beyond, watch the Tridion Developer Presentation by the SDL team about the GUI vNext.  For a complete list of the new features in SDL Tridion Sites 9, check out the release notes.  Despite the good and bad news, it is important that we keep our software current: it gives us the latest and greatest version, happier users, a larger version number (just kidding!) and longer support.  In this article I will list some of the things to consider when upgrading.

Upgrade Process

The upgrade process is usually somewhat confusing, as the architecture of Tridion in an enterprise setup can be complex and a bit overwhelming.

I tend to think of it from a bird’s eye perspective, and divide the architecture into 3 large pieces:  Content Management, Publishing, and Content Delivery.  These pieces all affect various parts of the organization and the ways they work – and this is something we should consider as part of our plan when we upgrade various mission-critical pieces to the new version.

Before we begin, if you’re using a web framework such as DXA, then you need to check if the version of DXA is compatible with the version of the Tridion upgrade.

First, I always suggest starting with a ‘sandbox’ server, which can be a copy of the Development environment.  This is a place where, if we make any mistakes, we won’t do any harm.  We try to get as close to production as possible, but sometimes we might not have various components such as load balancing configurations or other third-party systems.  Regardless, we try our best to have a fairly decent environment here that resembles production.  We should also have all of our latest Templates, Event System code and other Tridion extensions in this environment.  This is where most of our risk lies, and where we will want to have time and a safe place to try the old code in the new version – to fix any potential issues that pop up.  Once we analyze and try out our customizations in the new Tridion environment and prove they don’t break the newly upgraded system, we can relax, knowing that most of the risk of the upgrade is handled.

In the sandbox environment, most of the customizations and extensions in the Tridion system are in the CMS or the Publishing side, although many people also extend and use the Content Delivery APIs as well (using Java or .Net).

I suggest we start with the Content Manager server upgrade, as it is fairly quick and fairly painless, and the CME / DB usually upgrades without any major issues.  This then allows us to test our Templates and backend customizations.  Next up would be the Publisher upgrade, and again it is fairly painless.  Finally, we would upgrade Content Delivery, and the Micro Services.  This is a bit more time consuming and also can be trickier, given the number of moving pieces and numerous configuration files.  However, if you’re upgrading from 8.1 or 8.5, your experience should be fairly smooth, since as I mentioned above, little has changed…  Please refer to the SDL Docs Upgrade documentation for more details.

The first thing we want to try after we upgrade is to Publish.  This tests that all the layers of the architecture are working together and that we can deploy content changes.  If publishing doesn’t work, common culprits are a configuration issue in the Deployer conf file (check the database settings, etc.) or items hanging in the Publishing Queue, which might need a Publisher restart.  Once items are publishing, you can move on to testing other pages and page types.  I usually suggest re-publishing an entire site, or sites, if that is possible within your environment.  If not, then select a subset of pages which represent the main page types and publish them to be sure the mechanism works for these as well.
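Rather than clicking through the GUI for every test publish, a small Core Service snippet can list the transactions that ended in a failed state.  This is only a sketch: the endpoint name ‘netTcp_201501’ and the filter property names are assumptions from memory, so check them against the Core Service API reference for your version.

```csharp
// Hedged sketch: list failed Publish Queue transactions after an upgrade smoke-test.
// Assumes a client endpoint named "netTcp_201501" exists in the app.config and
// that the Tridion.ContentManager.CoreService.Client assembly is referenced.
using System;
using Tridion.ContentManager.CoreService.Client;

class PublishQueueCheck
{
    static void Main()
    {
        var client = new SessionAwareCoreServiceClient("netTcp_201501");

        // Filter down to transactions that did not publish successfully
        var filter = new PublishTransactionsFilterData
        {
            PublishTransactionState = PublishTransactionState.Failed
        };

        foreach (var item in client.GetSystemWideList(filter))
        {
            var trans = (PublishTransactionData)item;
            Console.WriteLine("{0}  {1}  {2}", trans.Id, trans.State, trans.Title);
        }
    }
}
```

This only reads from the queue, so it is safe to run repeatedly while re-publishing the test pages.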

Next we would want to look at any advanced extensions or customizations.  I would suggest that someone who has been a member of the Dev team for a while make a list of the customizations, the locations of the files, what the customization does, and how to test it.  This is very helpful not only during the upgrade, but in the future for testing as well.

If you’re using the excellent Alchemy Framework, make sure to check if the plugins continue to work with the new GUI, and also check to see if a new version of Alchemy is out and you could upgrade to take advantage of new features and compatibility.  Other common things to check are ‘Custom Pages’, Event System code, GUI Extensions, Core Service apps and templates that call external services.

Assuming everything still works, then we would want to update the Event System, GUI Extension, and Core Service apps to reference the latest Tridion Sites 9.0 DLLs instead of the 8.1 or 8.5 ones.
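If recompiling every app against the new DLLs at once is not practical, an assembly binding redirect in the app’s config file can sometimes bridge the gap temporarily.  A sketch is below; the publicKeyToken and version numbers are illustrative only – take the real values from the Sites 9.0 assemblies shipped with your installation media.

```xml
<!-- app.config sketch: redirect references to older Core Service client
     assemblies to the Sites 9.0 version. Verify token and versions against
     the actual DLLs before using. -->
<runtime>
  <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
    <dependentAssembly>
      <assemblyIdentity name="Tridion.ContentManager.CoreService.Client"
                        publicKeyToken="ddfc895746e5ee6b" culture="neutral" />
      <bindingRedirect oldVersion="0.0.0.0-9.0.0.0" newVersion="9.0.0.0" />
    </dependentAssembly>
  </assemblyBinding>
</runtime>
```

Treat this as a stop-gap for testing; a proper rebuild against the new DLLs is still the end goal.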

And, again, test everything to make sure that nothing breaks, and celebrate that this step in the upgrade has gone well.

Next up would be to look at any Publisher / Deployer customizations, as it is common to publish content to various search engines or databases.  Usually this code is written in Java, and with a bit of luck you will have a Java developer handy who is familiar with the code and Tridion, and can update the app to use the latest Tridion JARs and test it on their local development machine.  This can sometimes be a bit tricky if the developer doesn’t know Java, or doesn’t know Tridion.  So, if you have one of these types of extensions, then before the upgrade begins you might want to confirm the availability of the Java developer to support your team.  He or she will not be needed for the earlier steps.  Also, this part of the upgrade can be done on a local development machine and tested with Deployment Packages, and in that way the Java dev can use a local IDE such as Eclipse or NetBeans to debug the code.

Finally, if we are using any Content Delivery APIs, we would want to upgrade our WebApps to take advantage of the latest features.  This includes both .Net and Java environments.  The Tridion installation media includes the latest DLLs or JAR files, and we would then need to update our WebApps and re-compile them to take advantage of the latest version.  If you have some automated tests of the live website and it’s possible to run them here, that would be great and very helpful.  Also, it’s a good time to check some basic performance numbers, if possible.

If you’re using a Web Framework such as DXA, then you might need to also upgrade that to the latest version to be compatible with the latest Tridion version.  This would be part of the ‘upgrade WebApp’ step.

At this point most of the article and attention has been on customizations and the ‘risky’ part of the upgrade.  I haven’t discussed TopologyManager or MicroServices in detail, as they already exist in Web 8.1 and 8.5 and should continue to operate in more or less the same way, without any major changes.  If you’re upgrading from Tridion 2013 then you would need to plan a bit more time to handle the MicroServices and TopologyManager sides of the upgrade.

So, we come back to our original question:  Why upgrade?  It’s a good one, and it’s sometimes hard to answer.  With large software packages that rely on many external systems (database, server, etc.) it is good that we maintain them and keep them all in alignment, so they can all operate efficiently and as expected.  We also stay on the same version as other customers, and with that, we benefit from any hotfixes or service packs that solve open issues.  Finally, we provide our end users with the best that Tridion Sites has to offer at this point in time.  I’m a big believer that we can always do better, and if you’re using an older version of Tridion, the business and its users would probably be happier and more efficient using the latest version of the product.

If you need support or help for an upgrade please feel free to contact me at

DevReach Conference 2018

December 6th, 2018 | Posted by Robert Curlette in .NET | Conference - (0 Comments)

Being a Developer means we must constantly update our knowledge with the latest tools, techniques, frameworks and cloud options. One of the best ways to do that is to attend a developer conference, where we can absorb a lot of information in a few days and also network with fellow developers.

This year I was happy to return to DevReach in Sofia, an event organized by Progress Software (formerly Telerik) and focused on the Microsoft .Net technology stack. The first time I attended was in 2013, at a local cinema in Sofia, and I was impressed by the quality of the organization and the high quality of the talks and speakers. The event was one of the inspirations for me when I created the Tridion Developer Summit conference. This year did not disappoint! We were back at the cinema, with a very full agenda, several tracks and high-quality speakers. The speakers were all industry experts, mostly from Microsoft or Progress Software, and were very familiar with the content they spoke about.

A full list of sessions is available at

They also have a Facebook page with photos of the event.

All talks were recorded and will show up on the Progress YouTube channel here:

The agenda was littered with talks on AI and Machine Learning, Serverless and Docker, and also cloud and Azure. We had a few sessions about .Net, and these were very well attended, with people standing in the aisles or sitting on the floor in front, eyes glued to the code on the screen as the speakers unraveled it. I had to sit on the floor a couple of times myself. A good sign, if you ask me!

One of my favorite talks at the event was the Docker presentation by Chris Klug. He not only knew his content, he presented it in a very funny and interesting way. He walked us through why to use Docker, and how he uses it to run .Net Core in an isolated and trusted way on his laptop. Slides were at a minimum and he spent lots of time at the command prompt, issuing commands like a true console ninja and wow’ing us in the process. He ran over time, but he still wanted to show his last demo, so he fired off a few commands at the command prompt, hit return, waited 10 seconds, and it all worked…or so it seemed.

Another nice talk was the session right after his, by Tibi Covaci (from Codemag), on the state of .Net: what is happening in Core and in the latest version of the .Net Framework, 4.6. This was a really nice summary, and the presenter knew his content really well.

I also attended a few AI talks, which mostly focused on the business case and were at an introductory level, showing code only within PowerPoint. I hope next year we see more live demos and live coding, and also deeper talks.

On day 2 I was really looking forward to the Sitefinity Headless CMS talk, since I have a background in Content Management and wanted to see how Sitefinity would handle this new trend. After some basic concepts were explained, we were shown the OData response in the browser, and then a WebApp and a Mobile App that consumed the CMS data from the OData service. I have used OData in the past and was not too impressed with it; it seems to get difficult when filtering results or doing more advanced queries of the dataset. One of the participants made some comments about exactly this, and then said he had later figured it out. But, unfortunately, the talk was very short, using less than half of the allotted time, and didn’t seem well prepared. I would have liked to see more code, more of the behind-the-scenes of how he consumed the data, and in general more discussion around the idea of headless versus standard CMS.

After this talk I listened in on a .Net Rocks live show, always funny and a pleasure to listen to the topics discussed and how they move through so many different ideas. The discussion went to Serverless and tooling, with one of the guests speaking about VIM and her use of the editor. Others spoke about VS Code, and how helpful the debugger is in it.

Finally, the day wrapped up with a very inspirational talk from Neal (reverentgeek), titled ‘You Are Awesome!’. He comes from my home state of Georgia, and it was nice to hear a familiar accent. 🙂 Very inspiring, authentic and from the heart, Neal shared with us that we can all get better at something by putting in the time and dedication and continuing to practice. For example, a few years ago Neal discovered he likes to draw and sketched a man for a conference talk; 3 years later he drew the same sketch again, and the level of detail and quality had improved a lot! That reminded me that we need to practice a bit every day, to be nice to ourselves and allow ourselves to fail sometimes, but to keep moving forward. Big thanks to Neal for sharing his honest story.

We were all invited to a 10 year celebration party at a nearby hotel, where a famous Bulgarian rock band performed while we networked with the speakers and other attendees.

Overall, the event was very well organized and offered a lot to the participants. Big thanks to Progress for hosting the event! I hope they are able to continue providing such a high quality Developer event to the Balkan region.

DXA JSON 2 – Honey we shrunk the JSON

June 5th, 2018 | Posted by Robert Curlette in DXA - (1 Comments)

DXA 2 promises to publish less verbose JSON, sending smaller JSON items into the Publishing Queue and therefore speeding up our publish times.  It will also consume less space in the Broker DB.  So they say!  But, what does the new and old JSON look like and is it much lighter?  In this article I’ll highlight only the diffs between the rendering of a Keyword field JSON and Textfield JSON.  And, you might be wondering why I even care (aside from curiosity), but I’ve built a TBB that amends the Published DXA JSON and injects Structure Group Metadata fields masquerading as Page Metadata fields (in the JSON) and therefore available to our DXA Frontend WebApp.  OK, so here it is:

DXA 1.7 Keyword Field

"people": {
  "Name": "people",
  "Values": [ "Public and Member Communications", "Public Interest people", "Publications and Databases" ],
  "NumericValues": [ ],
  "DateTimeValues": [ ],
  "LinkedComponentValues": [ ],
  "FieldType": 3,
  "CategoryName": "people List",
  "CategoryId": "tcm:11-11393-512",
  "XPath": "Metadata/custom:people",
  "KeywordValues": [
    {
      "IsRoot": false,
      "IsAbstract": false,
      "Description": "",
      "Key": "",
      "TaxonomyId": "tcm:11-11393-512",
      "Path": "\\people List\\Public and Member Communications",
      "RelatedKeywords": [ ],
      "ParentKeywords": [ ],
      "MetadataFields": { },
      "Id": "tcm:11-106852-1024",
      "Title": "Public and Member Communications"
    },
    {
      "IsRoot": false,
      "IsAbstract": false,
      "Description": "",
      "Key": "",
      "TaxonomyId": "tcm:11-11393-512",
      "Path": "\\people List\\Public Interest people",
      "RelatedKeywords": [ ],
      "ParentKeywords": [ ],
      "MetadataFields": { },
      "Id": "tcm:11-106848-1024",
      "Title": "Public Interest people"
    },
    {
      "IsRoot": false,
      "IsAbstract": false,
      "Description": "",
      "Key": "",
      "TaxonomyId": "tcm:11-11393-512",
      "Path": "\\people List\\Publications and Databases",
      "RelatedKeywords": [ ],
      "ParentKeywords": [ ],
      "MetadataFields": { },
      "Id": "tcm:11-106853-1024",
      "Title": "Publications and Databases"
    }
  ]
}

DXA 2.0 Keyword Field

"people": {
  "$type": "KeywordModelData[]",
  "$values": [
    { "Id": "106852" },
    { "Id": "106848" },
    { "Id": "106853" }
  ]
}

DXA 1.7 Text Field

"language": {
  "Name": "language",
  "Values": [ "English" ],
  "NumericValues": [ ],
  "DateTimeValues": [ ],
  "LinkedComponentValues": [ ],
  "FieldType": 0,
  "XPath": "Metadata/custom:language",
  "KeywordValues": [ ]
}

DXA 2.0 Text Field

"language": "English",


So, there we have it, the new DXA 2.0 JSON delivers what it promises – much leaner and meaner JSON for the benefit of us all.

During a recent DXA project we experienced strange errors when publishing some content.  We are moving to DXA and the new DXA templates try to render every field of the Component and linked-to Components.  This is usually great, but if your content is a bit stale or outdated, or possibly contains an invalid value (like mine did) then the following script might be helpful.  (You should create a normal Core Service app, mine is a Console App, and reference all the usual Tridion Assemblies).

Publishing was failing with a complaint that a Category did not contain the Keyword.  The error message was ‘Keyword with title ‘Long Lost Keyword’ does not exist in Category ‘Amazing Category’ [tcm:7-12345-512].’ After some digging we found a Multimedia Component with invalid Metadata.   To view the invalid Metadata we used the excellent Tridion Alchemy plugin Show Item XML from the team at Content Bloom.  If you haven’t tried Alchemy yet, now is the perfect time, and this plugin alone makes it worth the (free) install.  You can even just install it on your Dev server if you want.

The simple solution for our invalid metadata problem would be to change it in the GUI, but the GUI didn’t show the value, and we were stuck.  So, we decided to write a small Core Service script that updates the metadata field or removes it.  Hope this helps.
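A sketch of that fix-up is below.  The item URI, field name and namespace prefix are illustrative (use the values from Show Item XML), and the CheckOut/CheckIn signatures should be verified against the Core Service API for your version.

```csharp
// Hedged sketch: remove an invalid metadata field from a Multimedia Component
// via the Core Service. URIs and field names below are placeholders.
using System;
using System.Xml;
using Tridion.ContentManager.CoreService.Client;

class FixMetadata
{
    static void Main()
    {
        var client = new SessionAwareCoreServiceClient("netTcp_201501");
        var component = (ComponentData)client.Read("tcm:7-1234", new ReadOptions());

        var doc = new XmlDocument();
        doc.LoadXml(component.Metadata);

        // Select the field holding the invalid Keyword value ("people" is hypothetical)
        var nsmgr = new XmlNamespaceManager(doc.NameTable);
        nsmgr.AddNamespace("custom", doc.DocumentElement.NamespaceURI);
        XmlNode badField = doc.DocumentElement.SelectSingleNode("custom:people", nsmgr);

        if (badField != null)
        {
            doc.DocumentElement.RemoveChild(badField);
        }

        component.Metadata = doc.OuterXml;
        client.CheckOut(component.Id, true, new ReadOptions());
        client.Update(component, new ReadOptions());
        client.CheckIn(component.Id, true, "Removed invalid metadata", new ReadOptions());
    }
}
```

To update instead of remove, set badField.InnerText to the corrected value before saving.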


Change the PublishState of an Item

April 4th, 2018 | Posted by Robert Curlette in Tridion | Tridion Core Service - (0 Comments)

Currently as part of the Publish process, in a post-build event I am sending JSON to an external search engine.  As part of that process, I wait for a response from the search engine that the content arrived successfully.

However, when it doesn’t arrive, I would like to notify authors via the Publish Queue status that it didn’t get there.  One solution is to update the Publish Status on the published item, setting it to Warning or Failed, and also update the text in the Publish Transaction.  The code below shows how we can do this.  I implemented it as a WebService, so it can be called from any external system, including the Event System.

public void Get(string transactionUri)
{
    string uri = "tcm:" + transactionUri;
    string binding = "netTcp_201501";
    SessionAwareCoreServiceClient client = new SessionAwareCoreServiceClient(binding);
    PublishTransactionData trans = client.Read(uri, new ReadOptions()) as PublishTransactionData;
    string title = trans.Title;
    trans.State = PublishTransactionState.Warning;
    trans.Information = "Didn't make it to Search Index - please publish again";
    client.Update(trans, new ReadOptions());
}


Tips for using JSON.NET to Update existing JSON Objects

January 24th, 2018 | Posted by Robert Curlette in .NET - (0 Comments)

Here are some helpful tips when working with JSON.NET.  It is sometimes challenging to find the right method or property to use because we don’t have access to intellisense when using the ‘dynamic’ object type often preferred by JSON.NET.
The context is that I get an existing JSON document, and I need to add new properties and nodes into the document.
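Here is a runnable sketch pulling those tips together: parse an existing document, add a property after a specific existing one, and attach a new object node.  The property names (‘title’, ‘zone’, ‘MetadataFields’) are just example data.

```csharp
// JSON.NET tips: anchored property insertion and dynamic node creation.
using System;
using Newtonsoft.Json.Linq;

class JsonNetTips
{
    public static string Transform(string text)
    {
        dynamic jsonObject = JObject.Parse(text);

        // Tip: Property("name").AddAfterSelf(...) controls where the new
        // property lands, instead of always appending at the end.
        JObject root = (JObject)jsonObject;
        root.Property("title").AddAfterSelf(new JProperty("zone", "north"));

        // Tip: dynamic member assignment on a JObject creates the property.
        dynamic extra = new JObject();
        extra.Name = "extra";
        jsonObject.MetadataFields.ExtraField = extra;

        return root.ToString(Newtonsoft.Json.Formatting.None);
    }

    static void Main()
    {
        Console.WriteLine(Transform("{\"title\":\"Home\",\"MetadataFields\":{}}"));
    }
}
```

The same two patterns (AddAfterSelf and dynamic assignment) cover most cases of grafting new nodes into an existing document.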

Hope this was of some help to you with providing some more examples of the excellent JSON.NET library.  If you have any more tips or suggestions for the code I am happy to hear them.  Thanks.

Accessing an Embedded Schema field Definition in a C# TBB

January 22nd, 2018 | Posted by Robert Curlette in .NET | Tridion - (0 Comments)

While creating some JSON for the Metadata on a Structure Group in C# I came across an interesting challenge: displaying the URI and Title of the Embedded Schema. Initially I was using a FieldDefinition instead of an EmbeddedSchemaFieldDefinition, and then the Id and Title properties of the Embedded Field were not available. Casting the variable to an EmbeddedSchemaFieldDefinition was the proper way to access the EmbeddedField definition info.


EmbeddedSchemaField right1Field = pageMetaFields[fieldName] as EmbeddedSchemaField;
EmbeddedSchemaFieldDefinition def = right1Field.Definition as EmbeddedSchemaFieldDefinition;

// embeddedSchemaFieldDef is our own output model object, serialized later
embeddedSchemaFieldDef.RootElementName = def.EmbeddedSchema.RootElementName; // "Content"
embeddedSchemaFieldDef.Id = def.EmbeddedSchema.Id;       // "tcm:11-123-8"
embeddedSchemaFieldDef.Title = def.EmbeddedSchema.Title; // "Metadata fieldname"

Amending DXA JSON

January 18th, 2018 | Posted by Robert Curlette in DXA - (0 Comments)

Sometimes you may wish to add content into the default DXA JSON that is published to the Broker, content that is not part of the default Component fields. In my situation I would like to have fields from the Page and Structure Group Metadata available in my DXA view. In DXA 1.7 there is no out-of-the-box, or ‘accelerated’, way to do this. My idea is to create an additional C# TBB that accesses the DXA JSON and adds a field to it. The advantage of this approach is that the additional field will be seen by the DXA runtime as a Page Metadata field, and is therefore serialized and available to us in the View without doing anything special on the DXA Frontend WebApp.

In this post I will share a sample app I used to access the DXA JSON and add an additional property into the MetadataFields collection. It was quite tricky to get this code to work, as we need to use the ‘dynamic’ type in C#, and this is without intellisense, so finding the appropriate methods and properties to use was a bit of a challenge. You will need to add the Newtonsoft JSON library from Nuget to get the code to work.

In a future article I will share more about the C# TBB that uses this code. The dxa.json file is the output of the Generate Dynamic Page (DXA) TBB from Template Builder. Also, in the code below I only show an example for a text field, and I do not produce the JSON for XPM – as I have no intention of using XPM on these ‘fake’ Page Metadata fields.

using System;
using System.Collections.Generic;
using Newtonsoft.Json.Linq;

namespace temp1
{
    class Program
    {
        static void Main(string[] args)
        {
            string text = System.IO.File.ReadAllText(@"C:\RC\dxa.json");
            dynamic jsonObject = JObject.Parse(text);
            dynamic meta = jsonObject.MetadataFields;

            dynamic zone = new JObject();
            zone.Name = "zone";

            // example value - the original field content goes here
            List<string> values = new List<string> { "myZoneValue" };

            dynamic array = JArray.FromObject(values);
            zone.Values = array;
            zone.FieldType = 0;  // text type, needs changing for other field types

            // title is a mandatory metadata field on the Page
            meta.Property("title").AddAfterSelf(new JProperty("zone", zone));
        }
    }
}

Creating a PublishNavigation JSON TBB in DXA

December 3rd, 2017 | Posted by Robert Curlette in DXA - (1 Comments)

DXA 1.7 includes Template Building Blocks that help us publish content in JSON format to the Tridion Broker database. One of these Template Building Blocks is the GenerateSitemap.tbb that publishes all Pages and Structure Groups from your entire website as 1 JSON file. You might think this sounds great – and it really is – one great big file! However, if you have thousands of Pages and hundreds of Structure Groups, you might just be interested to publish the Structure Group info and Index pages in the Navigation JSON file. In this short post I’ll share the code I used to get started.  If this is your first time hacking the DXA Template Building Blocks, you might want to check out my post here on how to get started compiling the DXA TBBs.

The idea here is to create a new TBB, GenerateNavigation.tbb, that publishes just the Structure Groups and index pages.

I’ve re-used the entire GenerateSitemap.tbb file and just modified one of the methods.

Here is my new modified method and also the call to it.


public override void Transform(Engine engine, Package package)
{
    Initialize(engine, package);

    _config = GetNavigationConfiguration(GetComponent());
    SitemapItem sitemap = GenerateStructureGroupNavigation(Publication.RootStructureGroup, true);

    string sitemapJson = JsonSerialize(sitemap);

    package.PushItem(Package.OutputName, package.CreateStringItem(ContentType.Text, sitemapJson));
}

private SitemapItem GenerateStructureGroupNavigation(StructureGroup structureGroup, bool structureGroupsOnly)
{
    SitemapItem result = new SitemapItem
    {
        Id = structureGroup.Id,
        Title = GetNavigationTitle(structureGroup),
        Url = System.Web.HttpUtility.UrlDecode(structureGroup.PublishLocationUrl),
        Type = ItemType.StructureGroup.ToString(),
        Visible = IsVisible(structureGroup.Title)
    };

    foreach (RepositoryLocalObject item in structureGroup.GetItems().Where(i => !i.Title.StartsWith("_")).OrderBy(i => i.Title))
    {
        SitemapItem childSitemapItem = null;
        Page page = item as Page;
        if (page != null)
        {
            if (page.FileName == "index")  // Add pages with the name index - APA Custom
            {
                if (!IsPublished(page))
                {
                    continue;
                }

                childSitemapItem = new SitemapItem
                {
                    Id = page.Id,
                    Title = GetNavigationTitle(page),
                    Url = GetUrl(page),
                    Type = ItemType.Page.ToString(),
                    PublishedDate = GetPublishedDate(page, Engine.PublishingContext.TargetType),
                    Visible = IsVisible(page.Title)
                };
            }
        }
        else
        {
            childSitemapItem = GenerateStructureGroupNavigation((StructureGroup)item, true);
        }

        if (childSitemapItem != null)
        {
            result.Items.Add(childSitemapItem);
        }
    }
    return result;
}

DXA Default Template Building Blocks – Updating

December 3rd, 2017 | Posted by Robert Curlette in DXA - (0 Comments)

In this article I’ll discuss the process of downloading and compiling the Default DXA TBBs, and then we can add our new TBB to the default DXA project.  You might want to do this so you can add another TBB into the DXA project, or to modify one of the existing ones.  However, the DXA team would like to get your updates as a Pull Request so they can make the existing ones even better.

Modifying the Default DXA TBBs

1. Download the sources (with the correct version selected) from here:

2. Open the solution and look at the Properties, then Build Events. There is a post-build event with the following:
"C:\Program Files (x86)\Microsoft\ILMerge\ILMerge.exe" ^
/out:Sdl.Web.Tridion.Templates.merged.dll ^
/targetplatform:v4 ^
/lib:C:\_references\cm-8.1 ^
Sdl.Web.Tridion.Templates.dll DD4T.ContentModel.Contracts.dll DD4T.ContentModel.dll DD4T.Serialization.dll DD4T.Templates.Base.dll Newtonsoft.Json.dll /log

3. Check if you have ILMerge.exe in the folder C:\Program Files (x86)\Microsoft\ILMerge\ILMerge.exe. If not, then download here:

4. Copy the DLLs from the Tridion/bin/client folder to a folder on your local drive. I prefer to keep the references as part of the project. For example, I use: C:\RC\dxa-tbbs-1.7\dxa-content-management-release-1.7\references

All DLLs are required, even the ECL ones, and they’re all listed on the README here: If you don’t have ECL installed, you’ll need to install it at least on your Dev server to get the DLLs. You can use Add/Remove Programs and the ‘Change’ option to add the feature. A restart is required, because the GUI will complain if you install ECL without restarting.  Also, the DLLs listed after the output path are expected to be found in the /bin/debug folder of the project.

5. Build

Potential errors:
1. Error code 3 – This means Visual Studio cannot find ILMerge.exe
2. Error code 1 – It cannot find the DLLs folder specified in the post-build script

– Use the /log switch in the post-build command to write the output to the ‘Output’ window for easier debugging

Happy hacking!