
How to fix comment character encoding in TortoiseHG

I imported a couple of repositories from SVN into Mercurial and discovered that characters not present in the standard ASCII table had become mangled in the comments… or at least they looked mangled in the console output as well as in TortoiseHG. Now, the console is not that important, but how do you fix this in Tortoise?

I tried searching for a way to modify the import process and found nothing. I tried adding a new comment with a non-ASCII character to the repository and got a Python error (“expected string, QString found”). Some said that I should change my Windows default system encoding (which is English (US)); that did solve the problem, but I would have liked a simpler solution, since changing the default encoding used to cause other problems in the past.

I managed to find a couple of workarounds that fix the console display by setting environment variables… Would the same work for Tortoise? Actually, it does. The solution is simple: go to (this is on Windows 7) Control Panel – System – Advanced System Settings – Environment Variables, add a new user variable called HGENCODING and set its value to either “utf-8” or your code page (mine is “cp1250”). TortoiseHg respects this. There is a slight difference between the two values, though: the diff viewer doesn’t really like “utf-8” and prefers the concrete code page. There may be other components that behave like this, so setting the code page seems to be the safer choice.
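
Alternatively, the same user-level variable should be settable from a command prompt with setx (substitute your own code page for cp1250; only newly started programs will pick it up):

setx HGENCODING cp1250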

Visual Studio 2010 cannot reference ManagedDTS dll from SQL Server 2005

A C# project that worked in Visual Studio 2008, once converted to Visual Studio 2010, starts complaining that it cannot find classes defined in Microsoft.SQLServer.ManagedDTS.dll and other assemblies that ship with SQL Server 2005. If you try to remove the reference and add it again, the errors disappear in the editor, but reappear when you compile the solution. At the end of the jumble of compiler errors there is a small warning that betrays the cause:

warning MSB3258: The primary reference "Microsoft.SQLServer.ManagedDTS, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91, processorArchitecture=MSIL" could not be resolved because it has an indirect dependency on the .NET Framework assembly "mscorlib, Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" which has a higher version "2.0.3600.0" than the version "2.0.0.0" in the current target framework.

The problem lies in Microsoft.SQLServer.msxml6_interop.dll, which references the beta version of the .NET Framework 2.0. Yes, even after installing three service packs – and worse still, even if you install SQL Server 2008 it will remain there. Why? Apparently, there’s a newer msxml6_interop dll with this reference fixed, but unfortunately it has the same version number as the old one, so it doesn’t replace it in the GAC. Talk about eliminating DLL hell.

But that’s not all: you cannot simply find the new dll and swap it into the GAC, because the old one is referenced by the Windows Installer and cannot be removed. You have to use brute force, something like this: open a command prompt and find the real path to the assembly on disk. (You cannot do this from Windows Explorer, because it replaces the real GAC folder structure with a conceptual, flat view.) So, CD to c:\Windows\Assembly and find the folder called Microsoft.SqlServer.msxml6_interop. In it there will be another folder called something like 6.0.0.0__89845dcd8080cc91, and in that folder the dll we’ve been looking for. On my computer, the full path is

c:\windows\assembly\GAC_MSIL\Microsoft.SqlServer.msxml6_interop\6.0.0.0__89845dcd8080cc91
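
If you’d rather let the command prompt find it for you, something like this should print the full path to the dll (assuming the default Windows folder):

dir c:\windows\assembly\GAC_MSIL\Microsoft.SqlServer.msxml6_interop /s /b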

Ok, now you should be able to manipulate the dll directly and replace it with the new one. What I like to do in these cases is SUBST the folder and make it accessible from Windows Explorer. Type something like this -

SUBST x: c:\windows\assembly\GAC_MSIL\Microsoft.SqlServer.msxml6_interop\6.0.0.0__89845dcd8080cc91

- and you will be able to see the folder in Windows Explorer as a separate volume, X:. From there you can delete the existing file and copy over the newer one. You can find the new one only if you have a machine where SQL Server 2008 was installed first – it’s in the same (or a similar) place in the GAC; I used the command prompt trick again to get the file. (Note that I did everything as administrator; you might have to employ additional tricks to work around security.)
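
Put together, the whole swap might look roughly like this – the c:\temp source path is just a placeholder for wherever you copied the fixed dll taken from the SQL Server 2008 machine:

SUBST x: c:\windows\assembly\GAC_MSIL\Microsoft.SqlServer.msxml6_interop\6.0.0.0__89845dcd8080cc91
copy /y c:\temp\Microsoft.SqlServer.msxml6_interop.dll x:\
SUBST x: /d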

Here’s a more detailed description with other possible solutions:

http://blogs.msdn.com/b/jason_howell/archive/2010/08/18/visual-studio-2010-solution-build-process-give-a-warning-about-indirect-dependency-on-the-net-framework-assembly-due-to-ssis-references.aspx

How to fix CAB to support dependencies across the class hierarchy

The Composite UI Application Block’s Object Builder doesn’t support dependencies for same-named properties at different levels in the class hierarchy. If you add a dependency property which has the same name as a property in a base or derived class, only one of them will be initialized.
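
To make the problem concrete, here is a hypothetical pair of CAB classes (the names are made up, and the usual Microsoft.Practices.CompositeUI usings are assumed) where both levels declare a dependency property called ActivationService – with the stock ObjectBuilder, only one of the two gets injected:

public class BaseController
{
    [ServiceDependency]
    public IWorkItemActivationService ActivationService { get; set; }
}

public class OrdersController : BaseController
{
    // hides BaseController.ActivationService by name; Type.GetProperties()
    // returns only this topmost property, so the base one is never set
    [ServiceDependency]
    public new IWorkItemActivationService ActivationService { get; set; }
}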

The reason for this is probably that the mechanism is based on the Type.GetProperties() method. This method doesn’t return all of the properties that the class (and its base classes) contain – rather, it employs a “hide by name and signature” convention and returns only the topmost properties. So the first step is to eliminate the GetProperties call. We do this by modifying the GetMembers() method of the PropertyReflectionStrategy (located in ObjectBuilder/Strategies/Property). It should look like this:

protected override IEnumerable<IReflectionMemberInfo<PropertyInfo>> GetMembers(IBuilderContext context, Type typeToBuild, object existing, string idToBuild)
{
    foreach (PropertyInfo propInfo in GetPropertiesFlattened(typeToBuild))
        yield return new PropertyReflectionMemberInfo(propInfo);
}

private IEnumerable<PropertyInfo> GetPropertiesFlattened(Type typeToBuild)
{
    for (Type t = typeToBuild; t != null; t = t.BaseType)
    {
        foreach (var pi in t.GetProperties(BindingFlags.Public | BindingFlags.Instance | BindingFlags.DeclaredOnly)) // get only properties in this class
        {
            yield return pi;
        }
    }
}

The next problem arises because the PropertyReflectionStrategy keeps a dictionary of existing properties. It’s indexed by property name, which would eliminate our duplicate properties. We have to change it to use the full path - property name prefixed by class name and namespace. I did this by adding a property called FullName to the IReflectionMemberInfo and ReflectionMemberInfo (found in ObjectBuilder/Strategies).

In IReflectionMemberInfo, add:

string FullName { get; }

In ReflectionMemberInfo, add:

public string FullName
{
    get { return memberInfo.DeclaringType.FullName + "." + memberInfo.Name; }
}

There’s a PropertyReflectionMemberInfo class nested in the PropertyReflectionStrategy; we have to add a similar property to it:

public string FullName
{
    get { return prop.DeclaringType.FullName + "." + prop.Name; }
}

Ok – next, in the PropertyReflectionStrategy we rewire the dictionary to use this new property. Go to the AddParametersToPolicy method and change this -

if (!result.Properties.ContainsKey(member.Name))
    result.Properties.Add(member.Name, new PropertySetterInfo(member.MemberInfo, parameter));

- to this -

if (!result.Properties.ContainsKey(member.FullName))
    result.Properties.Add(member.FullName, new PropertySetterInfo(member.MemberInfo, parameter));

One last glitch to fix: go to the CompositeUI/WorkItem class, and in the BuildUp() method change this -

propPolicy.Properties.Add("Parent", new PropertySetterInfo("Parent", new ValueParameter(typeof(WorkItem), null)));

- to this -

propPolicy.Properties.Add("Microsoft.Practices.CompositeUI.WorkItem.Parent", new PropertySetterInfo("Parent", new ValueParameter(typeof(WorkItem), null)));

Without this modification, the root WorkItem would have its Parent property reference itself, and it would not be recognized as the root WorkItem because its Parent property is not null. As a consequence, some initialization methods would not get called and almost nothing would work.

How to make LINQ to NHibernate eager-load joined properties like the Criteria API

In terms of “laziness” of a property, there are currently three different ways in which it can be mapped in NHibernate:

  • lazy=”false” means that it’s not lazy at all – the property’s content will be loaded along with its owner object. This means additional data is always loaded when you load an object, and may mean additional sql queries, too.
  • lazy=”proxy” means that the object contained in the property is loaded when any of its public members is accessed. The property will contain an instance of a proxy, which is an object that knows how to initialize itself on access. It performs the initialization not by loading its own properties but by loading an instance of the real object and redirecting its properties and methods to it. This is why everything on a class that is to be proxied needs to be virtual: the proxy is an instance of a dynamically generated class derived from it, with every public member overridden to support lazy loading (see the sketch after this list).
  • lazy=”no-proxy” means that the property is lazy-loaded, but without a proxy. From what I’ve seen, here the lazy property itself is manipulated on the owner object so that it facilitates on-access loading. In any case, there’s no proxy and no duplicate instances. This feature is currently (in v3.0.0) buggy and seems to work the same as the first option (lazy=”false”), just as it did in 2.0 when it was unsupported.
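
As an illustration of the proxy requirement: if a many-to-one property is mapped with lazy=”proxy”, the class it points to is the one that gets proxied, so all of its public members must be virtual – a hypothetical sketch (names made up):

public class Address
{
    // every public member is virtual, so the dynamically generated proxy
    // subclass can override it and load the real object on first access
    public virtual int Id { get; set; }
    public virtual string Street { get; set; }
    public virtual string City { get; set; }
}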

Each of the options has its bad sides: with proxies, you get duplicate objects and must make everything virtual on your data classes. With the non-lazy option, for each eager property a join is usually added to the sql query so that the property’s values are loaded at the same time, and the same goes for the property’s properties, etc. Even worse, HQL queries don’t respect this joining method: they load the main table in one SQL query and then execute an additional query or two (or a dozen) for each record to collect its related data – this is the “N+1 selects” problem. Needless to say, using HQL for such queries is madness: in these cases, it is best to switch to the Criteria API, which does the joins properly.

And the bad sides of the no-proxy option? It doesn’t work… Other than that, it seems the perfect solution: no joins, no duplicate objects. If you ask me, I don’t want my data to be loaded on-demand at all. I don’t want the application to decide when it will load its data: if I fill a datagrid with one hundred objects and then the grid triggers lazy-loading on each of the objects in turn, this will create chaos. No, I want the application to break if it accesses data that was not explicitly loaded. But with the current implementation I have no choice: it’s either eager or proxy, and I’m choosing eager, for better or worse.

How about LINQ queries? In 2.x, LINQ was implemented over the Criteria API, which means it knew how to join-load additional records. Not so in 3.x: now it behaves like HQL, with N+1 selects all over the place.

So, what is there to do? It seems the only option left is to write all queries with explicit fetch statements for every non-lazy property… This would definitely solve the execution performance issue, but development performance would suffer: if I add a new non-lazy property, I’d have to rewrite all queries where it appears.
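
For illustration, with the stock NHibernate.Linq Fetch extension every query would have to spell out its fetches by hand, something like this (Person and its properties are made-up examples):

var people = session.Query<Person>()
    .Where(p => p.Name == "John")
    .Fetch(p => p.HomeAddress)   // one explicit fetch per eager association...
    .Fetch(p => p.Employer)      // ...repeated in every query that loads Person
    .ToList();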

Ok, but if we’re using LINQ, it’s a dynamic query, right? It can be modified to include all the required fetches. After some research, it turns out that there’s a solution which (at least on the outside) even looks elegant: use an extension method to do this. So, you would do something like:

session.Query<Person>().Where(…).EagerFetchAll()

or:

(from p in session.Query<Person>() where … select …).EagerFetchAll()

Here’s one way this method could be implemented. Note that this is a somewhat hacked implementation and there are probably some unsupported cases – one thing that seems suspicious to me is that the Criteria API joined the eager properties recursively, while only the first level is covered here, so be careful. But it’s a good start... Preliminary tests were very promising ;).

// Being an extension method, this has to live in a public static class; it uses
// types from System.Linq, System.Linq.Expressions, NHibernate, NHibernate.Linq
// and NHibernate.Metadata.
public static IQueryable<TOriginating> EagerFetchAll<TOriginating>
  (this IQueryable<TOriginating> query)
{
  // hack the session reference out of the provider - or is
  // there a better way to do this?
  ISession session = (ISession)typeof(NhQueryProvider)
    .GetField("_session", System.Reflection.BindingFlags.Instance
      | System.Reflection.BindingFlags.NonPublic)
    .GetValue(query.Provider);

  IClassMetadata metaData = session.SessionFactory
    .GetClassMetadata(typeof(TOriginating));

  for(int i = 0; i < metaData.PropertyNames.Length; i++)
  {
    global::NHibernate.Type.IType propType = metaData.PropertyTypes[i];

    // get eagerly mapped associations to other entities
    if (propType.IsAssociationType && propType.IsEntityType
      && !metaData.PropertyLaziness[i])
    {
      ParameterExpression par = Expression.Parameter(typeof(TOriginating), "p");

      Expression propExp = Expression.Property(par, metaData.PropertyNames[i]);

      Expression callExpr = Expression.Call(null,
        typeof(EagerFetchingExtensionMethods).GetMethod("Fetch")
          .MakeGenericMethod(typeof(TOriginating), propType.ReturnedClass),
        // first parameter is the query, second is property access expression
        query.Expression, Expression.Lambda(propExp, par)
      );

      Type fetchGenericType = typeof(NhFetchRequest<,>)
        .MakeGenericType(typeof(TOriginating), propType.ReturnedClass);
      query = (IQueryable<TOriginating>)Activator.CreateInstance
        (fetchGenericType, query.Provider, callExpr);
    }
  }

  return query;
}
