
Attackmonkey Blog


Create a User For Content Changes in Code

I've not blogged for a while, and I've decided to do a series of shorter posts highlighting some useful techniques I've come across recently. Here's the first one!

In Umbraco, you may often have some task that manipulates content, media etc. via the various service APIs. For example, you might have a scheduled import task that imports pages from a feed, or a task that pulls in or modifies content based on things happening in a third party system.

By default, all actions carried out using the services are flagged against user 0 (the default Admin user). This is fine, but it means it's hard to tell which changes were made by your tasks, and which were made by the ACTUAL administrator.

Most of the Save and Publish methods have an additional parameter for the user ID, which can be the ID of any back office user. The fun part is that the user doesn't have to actually be able to access the back office. So you can create a user called something like "Task User" (you could even create a different user for each task if you wanted), then disable its Umbraco access with the checkbox. Then update all of your code to include the user ID.

So this:

Services.ContentService.Publish(myContent);

Would become something like this:

Services.ContentService.Publish(myContent, TaskHelper.TaskUser);

This means you can now pin down in the audit trail whether content changes were made by actual CMS users, or by one of your tasks!
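For reference, here's a rough sketch of what a helper like TaskHelper could look like in Umbraco 7 (the helper class, the "taskuser" account name and the caching are my own assumptions, not code from the original post):

using Umbraco.Core;
using Umbraco.Core.Models.Membership;

public static class TaskHelper
{
    private static int? _taskUserId;

    // The ID of the back office user that task changes are logged against.
    // Assumes a (login disabled) back office user with the username "taskuser" exists.
    public static int TaskUser
    {
        get
        {
            if (_taskUserId == null)
            {
                IUser user = ApplicationContext.Current.Services.UserService.GetByUsername("taskuser");

                // Fall back to the admin user (0) if the task user hasn't been created yet.
                _taskUserId = user != null ? user.Id : 0;
            }

            return _taskUserId.Value;
        }
    }
}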

Why did I start doing this? One of my clients inherited a site that had a public page with a link that called a controller that deleted ALL of the content of a specific type. No one knew about it until they had a pen test done, and the testers clicked on the link, deleting a sizeable chunk of important content on the site. The delete by content type method also permanently deleted the pages in question (but not the sub-pages). As we didn't know the page existed, or that the pen test was being run, it took us a while to track down the culprit. Had the actions been flagged against a task user, we'd have known straight away where to look!

One thing to note: if you are manipulating content through the APIs, you should keep updates to a minimum to avoid clogging up your versions table. Only update things if you need to. If you do need to make lots of updates, it's worth installing something like FALM Housekeeping or UnVersion to keep things under control.

Crazy Assed Shit With Contour, or How to Export ALL of Your Records as a Unified CSV

Recently I was tasked with doing some Crazy Assed Shit™ with Umbraco Contour. Note: this applies to old school Contour. The techniques outlined here won't work on Forms, as some of the required information is no longer in the database.

The client wanted to be able to export ALL of the forms on the site, merged into one giant CSV file, with the field captions as columns across the top of the spreadsheet. So effectively the output would be something like:

Form Name, Record ID, Date Submitted, Field 1, Field 2, Field 3, etc

Now, the requirement also stated that fields with the same caption should be merged together into one column.

Looking at the APIs, there are two options:

  • Use the library methods. There were three issues with this: one, they return XML; two, the caption names stored in the XML are only correct at the time of saving, so if a field name has been updated since, the caption will be wrong and you'll end up with extra columns that shouldn't be there; and finally, you have to get the results one form at a time, so there would be some horrific logic around grouping the fields.
  • Use the database driven API methods. This solves the stale caption names issue, but again you have to grab the results one form at a time, and the API is REALLY database intensive, which makes it pretty slow for dealing with large numbers of records at once.

So after scratching my head for a bit to think of a better way of doing it, preferably in a single database call, I decided that you could do the whole thing in one go (ish) using a pivot query. This is a special kind of query that allows you to turn rows into columns; they're commonly used in Excel and SQL Server reporting.

The only real obstacle is that you can't do a pivot query without knowing the column names up front. They HAVE to be hard coded into the query. The only way round that is to use dynamic SQL.

First things first, we're going to create a view that contains all of the raw data that we need to perform our magic. This will make the pivot query MUCH easier to read, and it also provides around a 20% performance boost to the final query, as we need to use the data twice (once to get all the captions, and once to actually do the pivot query). Create a new view, and use the following query:

-- One row per submitted field value: the record it belongs to, the field's
-- current caption, the record metadata, and the value itself.
SELECT dbo.UFRecordFields.Record, dbo.UFFields.Caption, dbo.UFRecords.Created AS [Date Submitted], dbo.UFRecords.IP, dbo.UFForms.Name AS [Contour Form Name], dbo.UFForms.Id,
    -- Prevalue fields (multi-select checkboxes etc.) get their selected values
    -- concatenated into a single comma separated string...
    CASE WHEN UFFields.PreValueProvider != '00000000-0000-0000-0000-000000000000' THEN STUFF
    ((SELECT ',' + a.Value
        FROM UFPrevalues a LEFT JOIN
        UFRecordDataString b ON CAST(a.Id AS NVarchar(255)) = b.Value
        WHERE a.Field = UFFields.Id AND b.[Key] = UFRecordFields.[Key] FOR XML PATH('')), 1, 1, '')
    -- ...everything else is coalesced from whichever typed value table holds the data.
    ELSE COALESCE(CAST(UFRecordDataBit.[Value] AS NVarchar(255)),
        CAST(UFRecordDataDateTime.[Value] AS NVarchar(255)), CAST(UFRecordDataInteger.[Value] AS NVarchar(255)), CAST(UFRecordDataLongString.[Value] AS NVarchar(MAX)),
        UFRecordDataString.[Value]) END AS FieldValues
FROM dbo.UFRecordFields LEFT OUTER JOIN
    dbo.UFRecordDataBit ON dbo.UFRecordFields.[Key] = dbo.UFRecordDataBit.[Key] LEFT OUTER JOIN
    dbo.UFRecordDataDateTime ON dbo.UFRecordFields.[Key] = dbo.UFRecordDataDateTime.[Key] LEFT OUTER JOIN
    dbo.UFRecordDataInteger ON dbo.UFRecordFields.[Key] = dbo.UFRecordDataInteger.[Key] LEFT OUTER JOIN
    dbo.UFRecordDataLongString ON dbo.UFRecordFields.[Key] = dbo.UFRecordDataLongString.[Key] LEFT OUTER JOIN
    dbo.UFRecordDataString ON dbo.UFRecordFields.[Key] = dbo.UFRecordDataString.[Key] LEFT OUTER JOIN
    dbo.UFFields ON dbo.UFRecordFields.Field = dbo.UFFields.Id LEFT OUTER JOIN
    dbo.UFRecords ON dbo.UFRecords.Id = dbo.UFRecordFields.Record LEFT OUTER JOIN
    dbo.UFForms ON dbo.UFRecords.Form = dbo.UFForms.Id

You might get an error about the query, but it should still save (the error is because one of the functions can't be represented in the diagram pane if you have it open). Save the view as "vwContourFieldValues". The resulting view contains ALL of the field values for every submitted record in the database. It also concatenates prevalue fields that allow multiple selections into a single comma separated string.

Now that the view is in place, we can write the SQL to pull out the pivot table. Basically, we will have to do two things. Firstly, pull out all of the column names to use in the query, and secondly build some dynamic SQL to use those column names in the pivot query.

Here's the code:

DECLARE @DynamicPivotQuery AS NVARCHAR(MAX)
DECLARE @ColumnName AS NVARCHAR(MAX)

--Get distinct values of the PIVOT column (the Contour field captions)
SELECT @ColumnName = ISNULL(@ColumnName + ',', '')
    + QUOTENAME(Caption)
FROM (
    SELECT DISTINCT Caption
    FROM vwContourFieldValues
) AS Captions ORDER BY [Caption]

--Build the pivot query, with the captions hard coded in as column names
SET @DynamicPivotQuery =
    N'SELECT [Contour Form Name], Record, [Date Submitted], IP,
    ' + @ColumnName + '
    FROM vwContourFieldValues

    PIVOT(MAX(FieldValues)
    FOR Caption IN (' + @ColumnName + ')) AS PVTTable
    ORDER BY [Contour Form Name] ASC, [Date Submitted] DESC'

--Execute the dynamic pivot query
EXEC sp_executesql @DynamicPivotQuery

Once you have the query written, you can run it, and it will list all of the records in the database, with all of the columns, and fields with the same caption merged into a single column. You can then output the results to CSV, so the client can open it in Excel and do any crazy data mining or reporting that they desire.

This is a pretty simple example, but you could easily modify the query to filter the forms by date range, or allow them to choose the forms that they want to bulk export.
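As a rough sketch of the date range version (the @From/@To parameters are my own additions, not part of the original query), you could filter the view in a derived table before pivoting:

DECLARE @From DATETIME = '2016-01-01'
DECLARE @To DATETIME = '2017-01-01'

--Same pivot as before, but only over records submitted in the date range
SET @DynamicPivotQuery =
    N'SELECT [Contour Form Name], Record, [Date Submitted], IP,
    ' + @ColumnName + '
    FROM (
        SELECT * FROM vwContourFieldValues
        WHERE [Date Submitted] >= @From AND [Date Submitted] < @To
    ) AS Filtered

    PIVOT(MAX(FieldValues)
    FOR Caption IN (' + @ColumnName + ')) AS PVTTable
    ORDER BY [Contour Form Name] ASC, [Date Submitted] DESC'

EXEC sp_executesql @DynamicPivotQuery, N'@From DATETIME, @To DATETIME', @From, @To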

I've tested this query on a site with around 70 forms, and over 150,000 records, and it took just over two minutes to run. Given that the resulting CSV file has something like 200 dynamic columns, that's not too bad. If you run it on a site where the forms share more fields, it's MUCH faster. You could probably make this even faster by using temporary tables, indexed views, or other performance optimisations.

As I mentioned at the beginning of this article, the code here won't work on Forms, as the Forms themselves are no longer stored in the database. You'd have to do some crazy voodoo involving getting all the forms, and creating a temporary lookup table with all the GUID/Caption pairs for every form on the site. I'm not 100% sure on how you'd get the prevalues back either, as they also don't appear to be in the database any more.

I may try to figure it out though, and post an example of how to do this in Forms if I manage it.

Running Umbraco Over Reverse Proxy

Recently I had to get Umbraco working over a reverse proxy, and I thought I'd share how I did it, as there's actually a pretty painless way to get it working!

First up, what is a reverse proxy? A reverse proxy is like a virtual folder, except that the contents of the virtual folder reside on another server. Why would you need to do this? Usually, if you need to run Umbraco in a virtual folder in a .NET site, you could use a normal virtual folder. However, security considerations might prevent installing the Umbraco site on the same box as the main site, or the main site might be written in PHP and hosted on a *nix box, for example.

Out of the box, platforms like Apache support reverse proxying natively through Mod Rewrite. In IIS, you have to use something called Application Request Routing (ARR), combined with the URL Rewrite module, to make it work. These are optional add-ons for IIS. If you're interested, there's plenty of documentation out there on how to do it with Mod Rewrite, and on how to set up ARR in IIS.

One of the things that you often have to do with reverse proxied content is rewrite it as it's sent back, to correct image paths and links etc. Normally you'd do this with outbound rewrite rules. It's not too painful for simple sites, but for something like Umbraco, which is a very complex system, it can be quite difficult to do without accidentally breaking things (you also need to heavily rewrite the contents of JS and CSS files, which can be quite slow, and each extra plugin usually adds to the work).

So I figured out a much easier way of getting it all to work. Let's say you have www.mysite.com and you want to reverse proxy the URL www.mysite.com/blog/ to blog.mysite.com, which is an Umbraco site on another server. All you need to do is set up the Umbraco site on blog.mysite.com to run in a virtual folder that matches the name that you want to use for the reverse proxy, e.g. blog.mysite.com/blog/. As virtual folder setups work pretty well out of the box, this means that you only need an outbound rule for the blog.mysite.com domain; everything else should just work, as the site already thinks it's in the /blog/ folder!
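For illustration, the inbound rule on www.mysite.com might look something like this in web.config (a minimal sketch; it assumes ARR's proxy mode has already been enabled at the server level, and the rule name is made up):

<system.webServer>
  <rewrite>
    <rules>
      <!-- Forward everything under /blog/ to the Umbraco site, keeping the rest of the path -->
      <rule name="ProxyToBlog" stopProcessing="true">
        <match url="^blog/(.*)" />
        <action type="Rewrite" url="http://blog.mysite.com/blog/{R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>

With the Umbraco site already living under /blog/, the only outbound rewriting left is swapping the blog.mysite.com host name for www.mysite.com in any absolute URLs it generates.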

Making Your Packages Work in a Virtual Directory

Just before Christmas I worked on a couple of projects where we needed to have Umbraco running in a virtual directory. Out of the box, Umbraco itself works very well in a virtual directory these days. The main issue that I ran into was packages that didn't work correctly when Umbraco was in a virtual folder.

From going through all the affected packages in the projects, I've compiled a list of things that you can do in your packages to help ensure that it will run correctly when your site is running in a virtual folder. So without further ado, here are the top five things to be aware of:

1. Paths in your package manifest

In your package manifest, ensure that your paths start with a ~. E.g. instead of referencing a script as "/app_plugins/my_plugin/script.js" use "~/app_plugins/my_plugin/script.js". Umbraco will then work out the correct path when it loads in all of the manifests.
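For example, a package.manifest for a hypothetical plugin might look like this (the plugin name and file names are made up):

{
    "javascript": [
        "~/app_plugins/my_plugin/script.js"
    ],
    "css": [
        "~/app_plugins/my_plugin/styles.css"
    ]
}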

2. Paths in your CSS

You often see people make their image paths in the CSS relative to the root of the site. This won't work in a virtual folder, so make sure all of your image paths are relative to the location of your CSS file. That way the paths will work, regardless of where the Umbraco site lives.
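For example, for a hypothetical stylesheet living in /app_plugins/my_plugin/css/:

/* Breaks in a virtual folder, because the site root is no longer "/" */
.my-plugin-icon { background: url(/app_plugins/my_plugin/images/icon.png); }

/* Works anywhere, because it's resolved relative to this CSS file */
.my-plugin-icon { background: url(../images/icon.png); }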

3. Paths in your JS

Again, it's common to see links to things in the JS file be relative to the root, e.g. "/umbraco/surface/mysurface/get". Your JS will be executing from the /umbraco/ path, so code your paths accordingly. E.g. in the previous example, you could use "../umbraco/surface/mysurface/get".

4. Paths in your templates

As with paths in your JS, code the paths to images and other resources in your templates/views as if they're in the /umbraco/ folder. So instead of "/app_plugins/myplugin/images/", use "../app_plugins/myplugin/images/".

5. Paths inside your code

If you're mapping paths inside your code for things like controllers and other compiled resources, don't forget the ~ at the start of the URL! This will then map the path to include the virtual folder as well.
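For example (a minimal sketch, with a made up plugin path):

// Resolves to the app_plugins folder inside the application, including any
// virtual folder, instead of assuming the site lives at the root of the domain.
var configPath = System.Web.Hosting.HostingEnvironment.MapPath("~/app_plugins/my_plugin/data.config");

// The same applies to URLs you hand back to the browser from your code:
var scriptUrl = System.Web.VirtualPathUtility.ToAbsolute("~/app_plugins/my_plugin/script.js");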

And that's about it. If you follow those five guidelines, your package should work on an Umbraco site that's running in a virtual folder!

 

Upgrading to 7.3: Part 3 - Switching to Razor

In part 2 of this series, I talked about upgrading the site to 7.3. At this point we have a nice shiny site, but all of the additional functionality is currently in XSLT. The client this is for prefers their sites in Razor these days, so as an added task, I now have to port the site over to Razor.

In the back office, all of the old templates were still listed, but as I hadn't copied the master pages across from the old site (as we're switching to MVC), they're all empty. It was fairly trivial to re-save each of those (which created the physical view files in my project), and then add a Layout file with the outer page markup. I copied all the HTML from the old templates, with a few minor changes. I was able to remove a couple of templates, and combine a couple of others!

The old site is nearly 5 years old, so pretty much everything is XSLT/Macros. Razor allows you to move everything into partials and controllers, so looking at the code, it's possible to do away with all of the Macros bar the one that allows you to embed a special corporate video into the Richtext editor.

I went through the site and converted all the old macro based functionality into Razor. Some of it could just be inlined in the templates, as it was template specific, the rest was turned into plain old partials.

A few parts of the code made sense as surface controllers (the search and a user profile page, as well as the login stuff). These were pretty easy to port over.
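For anyone who hasn't written one, a surface controller is just an MVC controller that Umbraco routes for you (under /umbraco/surface/ by default). A minimal sketch of the kind of thing the search port looked like (the controller, action and view names are made up for illustration):

using System.Web;
using System.Web.Mvc;
using Umbraco.Web.Mvc;

public class SiteSearchController : SurfaceController
{
    // Renders the search box; called from a template with @Html.Action("SearchForm", "SiteSearch")
    [ChildActionOnly]
    public ActionResult SearchForm()
    {
        return PartialView("SearchForm");
    }

    // Handles the posted search term, then reloads the current page with it on the query string
    [HttpPost]
    public ActionResult Search(string query)
    {
        return RedirectToCurrentUmbracoPage("query=" + HttpUtility.UrlEncode(query));
    }
}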

With the XSLT, a lot of the code could just be ported over and rewritten. Quite a few macros were no longer needed, or could be combined. By the time I was done, I only had a handful of partials, compared to the 20+ Macros on the old version of the site.

Once I'd tested everything to make sure it still worked compared to the old version of the site, it was time to tidy up. All of the old Macros and XSLT files were deleted from Umbraco.

At this point, we have a fully upgraded site, now running on the latest version of Umbraco! As an added bonus, between starting this and finishing, Umbraco crept up to 7.3.4, but a quick "Update-Package UmbracoCms" in the NuGet console in VS sorted that out!

The site and database can now be deployed, replacing the old site, and we're good to go!

All in all, the process has been significantly less hassle than I expected, and I'm really impressed with how straightforward it all was. I'd be much less hesitant to recommend upgrading a fairly straightforward Umbraco site than I might have been previously. I'd love to hear if anyone else has tried this, and whether your experiences were as painless as mine!