Running with Code: Like with scissors, only more dangerous


Exploring .dbc files with Dynamic Code Generation, part 4: Optimizing string reads

Posted by Rob Paveza

So far, we've written a simple parser for the .dbc file format. I've outlined that the .db2 file format is the same in principle, differing primarily in its header format.

We do know that premature optimization is the root of all evil. However, I'm going to wager that this optimization is not premature. Let's consider Item-sparse.db2 for a moment; it has 101 columns and a string table that is 2,762,301 bytes long.

Recall that a string-typed column in .dbc is stored as an integer offset into the string block, a region of UTF-8 encoded strings at the end of the table. This is nice for a lot of reasons; the tables can be read very fast because rows are all the same size, strings can be efficiently localized, and the strings don't need to be decoded until they're needed. This last optimization is the one we're going to look at today.
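GetString itself never appears in these posts, but given the format described above, a plausible sketch (the class and method names here are mine) is just a scan for the terminating zero:

```csharp
using System.Text;

static class StringBlockDemo
{
    // Reads the null-terminated UTF-8 string beginning at 'offset' within
    // the string block. Offset 0 is the zero-length string that starts
    // every .dbc string table.
    public static string Read(byte[] block, int offset)
    {
        int end = offset;
        while (end < block.Length && block[end] != 0)
            end++;
        return Encoding.UTF8.GetString(block, offset, end - offset);
    }
}
```

The real DbcTable.GetString additionally has to seek the stream to the string block and restore the position afterward, as noted later in the series.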

Delay-loading the strings is a nice optimization. UTF-8 strings, when they contain non-ASCII characters, do not have uniform character lengths. Decoding them inline while we read the tables is likely to cause many processor cache misses. How can we optimize? By bypassing them altogether, of course. The simplest way would be to expose a property such as "TitleStringOffset" instead of "Title", type it as an integer, and expect the user to call DbcTable.GetString. That would be a fine approach but, in my opinion, would leak implementation details to the user.

Instead, let's wrap that information into an object - call it DbcStringReference - and allow the string to be retrieved later. What would we need in order to retrieve it later? A reference to the DbcTable that produced it, and the integer offset into the string table. Storing a strong reference to the DbcTable would keep it from being garbage-collected for as long as any entity was alive, so we'll use a WeakReference instead.

public class DbcStringReference
{
        private WeakReference<DbcTable> _owner;
        private int _pos;
        private Lazy<string> _val;

        internal DbcStringReference(DbcTable owner, int position)
        {
            if (owner == null)
                throw new ArgumentNullException("owner");
            if (position < 0)
                throw new ArgumentException("position");

            _owner = new WeakReference<DbcTable>(owner);
            _pos = position;
            _val = new Lazy<string>(() =>
            {
                DbcTable table;
                if (!_owner.TryGetTarget(out table))
                    throw new ObjectDisposedException("DbcTable");

                return table.GetString(_pos);
            });
        }

        public override string ToString()
        {
            return _val.Value;
        }
}

There really isn't anything stellar here. We validate the constructor arguments, take a weak reference to the owner table, and then create a Lazy<string> which tries to resolve the strong reference, throws if that fails, and otherwise returns the string. Once the string has actually been retrieved, the value is cached and reused on subsequent accesses. Because the value of the string is exposed via the ToString override, string formatting methods like Console.WriteLine and string.Format automatically get the correct value (as does concatenation with strings and usage in UIs).

In order to add DbcStringReference to the list of supported types, we're going to need to make some modifications to our non-compiled and our compiled code sets. Fortunately, the ConvertSlow method itself doesn't need to change; all we need to change is the TargetInfo.SetValue<TTarget> method. Add this to the switch statement:

                    case TargetType.StringReference:
                        DbcStringReference sref = new DbcStringReference(table, inputVal);
                        SetValue(target, sref);
                        break;

Don't forget to add a StringReference value to your TargetType enum:

        internal enum TargetType
        {
            Int32,
            Float32,
            String,
            StringReference,
        }

And add in that return value to GetTargetTypeFromType:

        internal static TargetType GetTargetTypeFromType(Type type)
        {
            if (type == typeof(int))
                return TargetType.Int32;
            if (type == typeof(float))
                return TargetType.Float32;
            if (type == typeof(DbcStringReference))
                return TargetType.StringReference;
            if (type == typeof(string))
                return TargetType.String;

            throw new InvalidDataException("Invalid data type.");
        }

Let's force slow mode on. Here's a record definition for item-sparse.db2. (Yes, this is code-generated; I found the updated source).

public class Itemsparsedb2

        /// <summary>ID</summary>
        public int ID;

        /// <summary>Quality</summary>
        public int Quality;

        /// <summary>Flags</summary>
        public int Flags;

        /// <summary>Flags2</summary>
        public int Flags2;

        /// <summary>Column4</summary>
        public int Column4;

        /// <summary>Column5</summary>
        public float Column5;

        /// <summary>Column6</summary>
        public float Column6;

        /// <summary>Column7</summary>
        public int Column7;

        /// <summary>Price</summary>
        public int Price;

        /// <summary>SellPrice</summary>
        public int SellPrice;

        /// <summary>Column10</summary>
        public int Column10;

        /// <summary>Column11</summary>
        public int Column11;

        /// <summary>Column12</summary>
        public int Column12;

        /// <summary>ItemLevel</summary>
        public int ItemLevel;

        /// <summary>Column14</summary>
        public int Column14;

        /// <summary>Column15</summary>
        public int Column15;

        /// <summary>Column16</summary>
        public int Column16;

        /// <summary>Column17</summary>
        public int Column17;

        /// <summary>Column18</summary>
        public int Column18;

        /// <summary>Column19</summary>
        public int Column19;

        /// <summary>Column20</summary>
        public int Column20;

        /// <summary>Column21</summary>
        public int Column21;

        /// <summary>Column22</summary>
        public int Column22;

        /// <summary>Column23</summary>
        public int Column23;

        /// <summary>Column24</summary>
        public int Column24;

        /// <summary>Column25</summary>
        public int Column25;

        /// <summary>Column26</summary>
        public int Column26;

        /// <summary>Column27</summary>
        public int Column27;

        /// <summary>Column28</summary>
        public int Column28;

        /// <summary>Column29</summary>
        public int Column29;

        /// <summary>Column30</summary>
        public int Column30;

        /// <summary>Column31</summary>
        public int Column31;

        /// <summary>Column32</summary>
        public int Column32;

        /// <summary>Column33</summary>
        public int Column33;

        /// <summary>Column34</summary>
        public int Column34;

        /// <summary>Column35</summary>
        public int Column35;

        /// <summary>Column36</summary>
        public int Column36;

        /// <summary>Column37</summary>
        public int Column37;

        /// <summary>Column38</summary>
        public int Column38;

        /// <summary>Column39</summary>
        public int Column39;

        /// <summary>Column40</summary>
        public int Column40;

        /// <summary>Column41</summary>
        public int Column41;

        /// <summary>Column42</summary>
        public int Column42;

        /// <summary>Column43</summary>
        public int Column43;

        /// <summary>Column44</summary>
        public int Column44;

        /// <summary>Column45</summary>
        public int Column45;

        /// <summary>Column46</summary>
        public int Column46;

        /// <summary>Column47</summary>
        public int Column47;

        /// <summary>Column48</summary>
        public int Column48;

        /// <summary>Column49</summary>
        public int Column49;

        /// <summary>Column50</summary>
        public int Column50;

        /// <summary>Column51</summary>
        public int Column51;

        /// <summary>Column52</summary>
        public int Column52;

        /// <summary>Column53</summary>
        public int Column53;

        /// <summary>Column54</summary>
        public int Column54;

        /// <summary>Column55</summary>
        public int Column55;

        /// <summary>Column56</summary>
        public int Column56;

        /// <summary>Column57</summary>
        public int Column57;

        /// <summary>Column58</summary>
        public int Column58;

        /// <summary>Column59</summary>
        public int Column59;

        /// <summary>Column60</summary>
        public int Column60;

        /// <summary>Column61</summary>
        public int Column61;

        /// <summary>Column62</summary>
        public int Column62;

        /// <summary>Column63</summary>
        public int Column63;

        /// <summary>Column64</summary>
        public int Column64;

        /// <summary>Column65</summary>
        public int Column65;

        /// <summary>Column66</summary>
        public int Column66;

        /// <summary>Column67</summary>
        public int Column67;

        /// <summary>Column68</summary>
        public int Column68;

        /// <summary>Column69</summary>
        public int Column69;

        /// <summary>Name</summary>
        public string Name;

        /// <summary>Name2</summary>
        public string Name2;

        /// <summary>Name3</summary>
        public string Name3;

        /// <summary>Name4</summary>
        public string Name4;

        /// <summary>Description</summary>
        public string Description;

        /// <summary>Column75</summary>
        public int Column75;

        /// <summary>Column76</summary>
        public int Column76;

        /// <summary>Column77</summary>
        public int Column77;

        /// <summary>Column78</summary>
        public int Column78;

        /// <summary>Column79</summary>
        public int Column79;

        /// <summary>Column80</summary>
        public int Column80;

        /// <summary>Column81</summary>
        public int Column81;

        /// <summary>Column82</summary>
        public int Column82;

        /// <summary>Column83</summary>
        public int Column83;

        /// <summary>Column84</summary>
        public int Column84;

        /// <summary>Column85</summary>
        public int Column85;

        /// <summary>Column86</summary>
        public int Column86;

        /// <summary>Column87</summary>
        public int Column87;

        /// <summary>Column88</summary>
        public int Column88;

        /// <summary>Column89</summary>
        public int Column89;

        /// <summary>Column90</summary>
        public int Column90;

        /// <summary>Column91</summary>
        public int Column91;

        /// <summary>Column92</summary>
        public int Column92;

        /// <summary>Column93</summary>
        public int Column93;

        /// <summary>Column94</summary>
        public int Column94;

        /// <summary>Column95</summary>
        public int Column95;

        /// <summary>Column96</summary>
        public int Column96;

        /// <summary>Column97</summary>
        public int Column97;

        /// <summary>Column98</summary>
        public float Column98;

        /// <summary>Column99</summary>
        public int Column99;

        /// <summary>Column100</summary>
        public int Column100;

        /// <summary>Column101</summary>
        public int Column101;

Running this in our test harness with item-sparse.db2, enumerating all the records 5 times, this takes 173959ms. Replacing the five string properties with DbcStringReference reduces the time to 173150ms. Hardly any improvement! Now, let's add in dynamic compilation support.

Add this below the MethodInfo declarations in DbcTableCompiler:

private static ConstructorInfo DbcStringReference_ctor = typeof(DbcStringReference).GetConstructor(BindingFlags.NonPublic | BindingFlags.Instance, null, new Type[] { typeof(DbcTable), typeof(int) }, null);

Add this into the EmitTypeData in DbcTableCompiler:

                case TargetType.StringReference:
                    generator.Emit(OpCodes.Ldarg_2); // push 'table'
                    generator.Emit(OpCodes.Ldarg_0); // push 'reader'
                    generator.EmitCall(OpCodes.Callvirt, BinaryReader_ReadInt32, null);
                    generator.Emit(OpCodes.Newobj, DbcStringReference_ctor);
                    break;

And that's it. The test harness running with string properties takes 13715ms. But... the test harness running with DbcStringReference takes only 2896ms!

Why such a substantial difference? I don't know for sure, but I can make a supposition. The DbcStringReference constructor call is small enough to be inlined, which allows the processor cache to stay efficient. Since we never dig into Encoding.GetString during enumeration, we save even more time. Of course, we don't actually have a copy of the string yet, but if I'm looking up an item by its ID, I'm saving quite a bit of time.

So in summary:

  • String fields, no dynamic compilation: 173959ms
  • DbcStringReference fields, no dynamic compilation: 173150ms, 0.5% improvement
  • String fields, dynamic compilation: 13715ms, 92.1% improvement
  • DbcStringReference fields, dynamic compilation: 2896ms, 98.3% improvement

Next time? Not sure yet. I'll get back to you.


Exploring .dbc files with Dynamic Code Generation, part 3: Dynamic codegen!

Posted by Rob Paveza

Last time, we talked about how to populate a class based on its ordered fields, using Reflection to do the heavy lifting for us. This time, we're going to generate a DynamicMethod that does the same work ConvertSlow did, but with code generated at runtime.

First, let's change GetAt here:

        public T GetAt(int index)
        {
            if (_store == null)
                throw new ObjectDisposedException("DbcTable");

            _store.Seek(_perRecord * index + _headerLength, SeekOrigin.Begin);

            T target = new T();
            // This is the only change
            Convert.Value(_reader, _recordLength, this, target);
            return target;
        }

The only change here is that we've changed ConvertSlow to Convert.Value. The Convert field is a Lazy<DbcReaderProducer<T>>; DbcReaderProducer<T> is a delegate type whose signature matches the ConvertSlow method:

internal delegate void DbcReaderProducer<T>(BinaryReader reader, int fieldsPerRecord, DbcTable table, T target)
        where T : class;

We're going to aim for creating DynamicMethods that have this signature. The Convert field then gets defined as follows:

        private static Lazy<DbcReaderProducer<T>> Convert = new Lazy<DbcReaderProducer<T>>(() =>
        {
            if (Config.ForceSlowMode)
                return DbcTable<T>.ConvertSlow;

            try
            {
                return DbcTableCompiler.Compile<T>();
            }
            catch
            {
                if (Debugger.IsAttached && !Config.ForceSlowMode)
                    Debugger.Break();

                return DbcTable<T>.ConvertSlow;
            }
        });

So the magic is all in the DbcTableCompiler. Let's take a look at how that will work. Before writing code, let's outline it, using the CharTitles.dbc file we wrote in Part 1:

public class CharacterTitleRecord
{
    public int ID;
    public int RequiredAchievementID;
    public string Title;
}

Were I writing a DbcReaderProducer<CharacterTitleRecord> by hand, it might look like this:

static void FillCTR(BinaryReader reader, int fieldsPerRecord, DbcTable table, CharacterTitleRecord target)
{
    // assumption: reader is already at the beginning of the record
    target.ID = reader.ReadInt32();
    target.RequiredAchievementID = reader.ReadInt32();
    int stringTablePos = reader.ReadInt32();
    target.Title = table.GetString(stringTablePos); // Note: Table.GetString restores the current Stream position
    reader.ReadInt32(); // unused column @ index 3
    reader.ReadInt32(); // unused column @ index 4
    reader.ReadInt32(); // unused column @ index 5
}

That's pretty straightforward, right? Let's break down how that compiles.

  • I have to call reader.ReadInt32() (or ReadSingle if it's a float-type parameter). That puts the return value on the evaluation stack.
  • If the field type is String, call table.GetString.
  • Store the value onto the target's field.

Before we go on, we need to talk about the .NET execution/evaluation stack. This stack is a virtual one that doesn't necessarily have any relationship to the physical hardware; it's just conceptually how the CLR reasons about the data that it's processing. For example, suppose you have an assignment x = a + b;. The stack would change in the following ways:

  1. push a
  2. push b
  3. add (consumes the top two values on the stack, and pushes the result onto the stack)
  4. assign to 'x' (consumes the top value on the stack)

In this way, the stack remains balanced. You can read more about how the CLR execution environment works in ECMA-335, the CLI specification. The other thing to note is that the CLR follows a single calling convention. If a method being called is an instance method, then the instance reference is pushed onto the stack followed by the parameters in left-to-right order; if it is a static method, then the instance reference is skipped. (Variadic functions have their arguments converted to an array at the call site.)
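To make that concrete, here is a tiny standalone DynamicMethod (unrelated to the DbcTable code) that compiles a + b exactly as the four steps above describe:

```csharp
using System;
using System.Reflection.Emit;

static class StackDemo
{
    // Builds int Add(int a, int b) => a + b out of raw IL,
    // following the push/push/add sequence described above.
    public static Func<int, int, int> BuildAdd()
    {
        var dm = new DynamicMethod("Add", typeof(int), new[] { typeof(int), typeof(int) });
        ILGenerator il = dm.GetILGenerator();
        il.Emit(OpCodes.Ldarg_0); // push a
        il.Emit(OpCodes.Ldarg_1); // push b
        il.Emit(OpCodes.Add);     // consumes a and b, pushes a + b
        il.Emit(OpCodes.Ret);     // returns the top of the stack; the assignment happens at the call site
        return (Func<int, int, int>)dm.CreateDelegate(typeof(Func<int, int, int>));
    }
}
```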

Knowing this, we should be able to compile our own implementation as long as we know the right IL opcodes. I do; let's compile the above function now.

static void FillCTR(BinaryReader reader, int fieldsPerRecord, DbcTable table, CharacterTitleRecord target)
    ldarg.3  // push 'target' onto the stack
    ldarg.0  // push 'reader' onto the stack
    callvirt (instance method) BinaryReader.ReadInt32(void) // consumes 'reader', pushes result
    stfld (instance field) CharacterTitleRecord.ID // consumes 'target' and result of previous, assigns value

    ldarg.3
    ldarg.0
    callvirt (instance method) BinaryReader.ReadInt32(void)
    stfld (instance field) CharacterTitleRecord.RequiredAchievementID

    ldarg.3 // push 'target' onto the stack
    ldarg.2 // push 'table' onto the stack
    ldarg.0 // push 'reader' onto the stack
    callvirt (instance method) BinaryReader.ReadInt32(void)
    callvirt (instance method) DbcTable.GetString(int32) // consumes 'table' and the result of ReadInt32
    stfld (instance field) CharacterTitleRecord.Title

    ldarg.0
    callvirt (instance method) BinaryReader.ReadInt32(void)
    pop // unused column @ index 3

    ldarg.0
    callvirt (instance method) BinaryReader.ReadInt32(void)
    pop // unused column @ index 4

    ldarg.0
    callvirt (instance method) BinaryReader.ReadInt32(void)
    pop // unused column @ index 5

    ret


Woohoo! Now THAT is the kind of thing that seems highly automatable. Let's start thinking about how we can implement this. Starting with the actual Compile method:

        internal static DbcReaderProducer<T> Compile<T>()
            where T : class
        {
            // Supposing <T> is "DbcReader.CharacterTitleRecord", this method 
            // creates a DynamicMethod named $DbcTable$DbcReader$CharacterTitleRecord, with the same access
            // to types as the DbcTableCompiler type (i.e., internal to this assembly).
            // It returns void, and accepts BinaryReader, int, DbcTable, and T as its arguments.  In other words, 
            // it is compatible with DbcReaderProducer<T>.
            DynamicMethod method = new DynamicMethod("$DbcTable$" + Regex.Replace(typeof(T).FullName, "\\W+", "$"), typeof(void), new Type[] { typeof(BinaryReader), typeof(int), typeof(DbcTable), typeof(T) }, typeof(DbcTableCompiler).Assembly.ManifestModule);
            ILGenerator gen = method.GetILGenerator();

            var properties = GetTargetInfoForType(typeof(T)); // Same method as before
            var propertyMap = properties.ToDictionary(ti => ti.Position);
            var maxPropertyIndex = propertyMap.Keys.Max(); // We go in order, unlike the naïve implementation
            for (int i = 0; i <= maxPropertyIndex; i++)
            {
                TargetInfo info;
                if (propertyMap.TryGetValue(i, out info))
                {
                    EmitForField(info, gen); // We have to do our magic here.  Allow us to grow to support properties later.
                }
                else // unused column below
                {
                    gen.Emit(OpCodes.Ldarg_0); // push 'reader'
                    gen.EmitCall(OpCodes.Callvirt, BinaryReader_ReadInt32, null);
                    gen.Emit(OpCodes.Pop); // discard the unused value
                }
            }

            gen.Emit(OpCodes.Ret);

            return method.CreateDelegate(typeof(DbcReaderProducer<T>)) as DbcReaderProducer<T>;
        }

All told, that is neither complex nor scary. EmitForField is also not terribly complex:

        private static void EmitForField(TargetInfo info, ILGenerator generator)
        {
            Debug.Assert(info != null);
            Debug.Assert(generator != null);
            Debug.Assert(info.Field != null);

            generator.Emit(OpCodes.Ldarg_3); // push 'target' onto the stack

            EmitTypeData(info, generator);

            generator.Emit(OpCodes.Stfld, info.Field);
        }

The EmitForField method is designed as it is to allow us to grow to support properties later. (That will require pushing a reference to the 'target' object before doing the type-specific mutations and then calling the property setter, which is a slightly different activity than this one). So let's look at EmitTypeData, which does the real heavy lifting:

        private static MethodInfo BinaryReader_ReadInt32 = typeof(BinaryReader).GetMethod("ReadInt32");
        private static MethodInfo BinaryReader_ReadSingle = typeof(BinaryReader).GetMethod("ReadSingle");
        private static MethodInfo DbcTable_GetString = typeof(DbcTable).GetMethod("GetString");

        private static void EmitTypeData(TargetInfo info, ILGenerator generator)
        {
            switch (info.Type)
            {
                case TargetType.Float32:
                    generator.Emit(OpCodes.Ldarg_0); // push 'reader'
                    generator.EmitCall(OpCodes.Callvirt, BinaryReader_ReadSingle, null);
                    break;
                case TargetType.Int32:
                    generator.Emit(OpCodes.Ldarg_0); // push 'reader'
                    generator.EmitCall(OpCodes.Callvirt, BinaryReader_ReadInt32, null);
                    break;
                case TargetType.String:
                    generator.Emit(OpCodes.Ldarg_2); // push 'table'
                    generator.Emit(OpCodes.Ldarg_0); // push 'reader'
                    generator.EmitCall(OpCodes.Callvirt, BinaryReader_ReadInt32, null);
                    generator.EmitCall(OpCodes.Callvirt, DbcTable_GetString, null);
                    break;
                default:
                    throw new NotSupportedException("Invalid type for target property.");
            }
        }

This looks suspiciously similar to what we did before! In fact, we can now see each of the same code paths, plus an extra path for 32-bit floats. So that's pretty well set to go.

What does this do for us? Iterating through the CharTitles table 5 times goes from 37ms to 17ms on my laptop when not plugged in; iterating through the ItemDisplayInfo table 5 times goes from 880ms to 240ms. That's not a bad gain, especially with no optimizations like seeking!
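One aside before moving on: EmitForField is shaped the way it is so that property support can slot in later, and that path can be sketched in isolation. The Sample type and BuildSetter name below are mine, not part of DbcTableCompiler; the only differences from the field case are the order of the pushes and calling the setter instead of stfld:

```csharp
using System;
using System.Reflection;
using System.Reflection.Emit;

public class Sample
{
    public int Value { get; set; }
}

static class PropertySetterDemo
{
    // Builds void Set(Sample target, int value) => target.Value = value
    // via IL: push the target, push the value, then call the setter.
    public static Action<Sample, int> BuildSetter()
    {
        var dm = new DynamicMethod("SetSampleValue", typeof(void),
            new[] { typeof(Sample), typeof(int) });
        ILGenerator il = dm.GetILGenerator();
        il.Emit(OpCodes.Ldarg_0);                    // push 'target'
        il.Emit(OpCodes.Ldarg_1);                    // push the value
        MethodInfo setter = typeof(Sample).GetProperty("Value").GetSetMethod();
        il.EmitCall(OpCodes.Callvirt, setter, null); // consumes both
        il.Emit(OpCodes.Ret);
        return (Action<Sample, int>)dm.CreateDelegate(typeof(Action<Sample, int>));
    }
}
```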

Next time: We're going to optimize string reads.


Exploring .dbc files with C# Dynamic Code Generation, part 2: The naive implementation

Posted by Rob Paveza

Last time, we defined the problem of viewing a DBC file from World of Warcraft and created the shape of an API we'd like to use for that purpose. This time, we'll implement a basic parser which uses reflection but no dynamic code generation to populate the individual records.

Let's make some assumptions about where we are:

  • A DBC file is accessible via a random-access stream
  • We're only handling int, float, and string. The others are variations of int.

Here's a quick enumeration outlining the different types that we might set:

        internal enum TargetType
        {
            Int32,
            Float32,
            String,
        }

In DbcReader, I actually supported both read-write properties and fields; but, for simplicity, we'll just support fields. For the most part, they work the same. We're going to create a simple type that keeps track of the relevant information -- that is, the Reflection FieldInfo of the field, the column Position of the field in the row, and the TargetType.

        internal class TargetInfo
        {
            public FieldInfo Field;
            public int Position;
            public TargetType Type;

            // inputVal is the 4-byte value that was read from the underlying
            // stream for this column.
            public void SetValue<TTarget>(TTarget target, int inputVal, DbcTable table)
                where TTarget : class
            {
                switch (Type)
                {
                    case TargetType.Int32:
                        SetValue(target, inputVal);
                        break;
                    case TargetType.Float32:
                        byte[] bits = BitConverter.GetBytes(inputVal);
                        SetValue(target, BitConverter.ToSingle(bits, 0));
                        break;
                    case TargetType.String:
                        string tmp = table.GetString(inputVal);
                        SetValue(target, tmp);
                        break;
                }
            }

            public void SetValue<TValue, TTarget>(TTarget target, TValue inputVal)
                where TTarget : class
            {
                Field.SetValue(target, inputVal);
            }
        }
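One detail worth calling out: the Float32 case reinterprets the raw bits of the int rather than numerically casting it. A standalone check of that round-trip (the helper name is mine):

```csharp
using System;

static class Reinterpret
{
    // Reinterprets an int's raw bytes as an IEEE-754 float,
    // exactly as the Float32 case does with BitConverter.
    public static float Int32BitsToSingle(int value)
    {
        byte[] bits = BitConverter.GetBytes(value);
        return BitConverter.ToSingle(bits, 0);
    }
}
```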

This class provides a convenient way to set the value of a field in a TTarget entity. So how is this used? From within a DbcTable<T>, this produces a record:

        private static void ConvertSlow(BinaryReader reader, int fieldsPerRecord, DbcTable table, T target)
        {
            int[] values = new int[fieldsPerRecord];
            for (int i = 0; i < fieldsPerRecord; i++)
                values[i] = reader.ReadInt32();

            Type t = typeof(T);
            foreach (var targetInfo in DbcTableCompiler.GetTargetInfoForType(t))
                targetInfo.SetValue(target, values[targetInfo.Position], table);
        }

The only thing left to implement is the GetTargetInfoForType function. It isn't terribly complex; it just enumerates the fields. Because we enter this function having already read the int[] of all columns, the array is random-access, so the order in which the fields are enumerated isn't particularly important.

        internal static IEnumerable<TargetInfo> GetTargetInfoForType(Type type)
        {
            // Get all public instance fields
            foreach (FieldInfo fi in type.GetFields(BindingFlags.Public | BindingFlags.Instance))
            {
                // Get DbcRecordPosition attributes but don't inherit
                var attr = fi.GetCustomAttribute<DbcRecordPositionAttribute>(false);
                if (attr != null)
                {
                    var result = new TargetInfo
                    {
                        Field = fi,
                        Position = attr.Position,
                        Type = GetTargetTypeFromType(fi.FieldType),
                    };
                    yield return result;
                }
            }
        }

This is a simple enumerator that looks at each public field on a class and yields the relevant ones back to the caller that invoked the enumeration.

So how does this all fit together?

We have a class DbcTable and child class DbcTable<T>. The DbcTable owns and operates the underlying Stream and a BinaryReader that lives atop the Stream. The ConvertSlow function manages all of that work, so the implementation of GetAt (which provides the record at a specified index) is very simple:

        public T GetAt(int index)
        {
            if (_store == null)
                throw new ObjectDisposedException("DbcTable");

            // _store is the Stream
            _store.Seek(_perRecord * index + _headerLength, SeekOrigin.Begin);

            T target = new T(); // DbcTable<T> must be declared as where T : class, new()
            ConvertSlow(_reader, _recordLength, this, target);
            return target;
        }

That's all there is to it! Next time, we'll actually start getting into dynamic code generation. We're going to cheat and write it by hand, then reverse engineer that out.


Exploring .dbc files with C# Dynamic Code Generation, part 1: Defining the problem

Posted by Rob Paveza

You know, I look back at my blog after all these years and the particularly infrequent updates and reflect a little bit on just how much things have changed for me. I know that right now I want to be writing code using back-ticks because I'm so accustomed to writing Markdown. But that's neither here nor there.

I recently published a project called DbcExplorer on GitHub. This was just a pet project I'd been working on during the World of Warcraft: Mists of Pandaria timeline; I'd just joined Microsoft and had my Windows Phone, but there's no Mobile Armory on Windows Phone (or even Big Windows, for that matter). A little bit of background: World of Warcraft stores simple databases in files with an extension of .dbc or .db2; these databases allow rapid lookup by ID or simple fast enumeration. There are a myriad of them, and they commonly change from version to version. The reason I wanted them was to be able to crawl item and achievement information for the purpose of creating a miniature Mobile Armory for Windows Phone, one that could at least tell you which achievements you were lacking, and let people vote on which achievements were easiest, so that you could quickly boost your score.

(Side note: When Warlords of Draenor was released, Blizzard changed their storage archive format from MPQ to CASC. Ladislav Zezula, who created StormLib, which was a C library for accessing MPQ files, had made some progress at the time at CASC as well. However, I couldn't get it to work at the time, so I stopped working on this project. Ladik and I recently figured out what the disconnect was, and I've now wrapped his CascLib into CascLibSharp, but I don't know that I'll be resurrecting the other project).

Anyway, DBC files are pretty easy. They have a header in the following form:

uint32        Magic 'WDBC', 'WDB2', or 'WCH2'
uint32        Number of records
uint32        Number of columns per record
uint32        Number of bytes per record (always 4x # of columns as far as I can tell)
uint32        String block length

The files that aren't of type 'WDBC' have a few additional fields, but the general structure is the same. The files then have the general form:

DbcHeader     Header
Record[Count] Records
uint8         0  (Start of string table, a 0-length string)
uint8[]       String block (UTF-8 encoded, null-terminated strings)
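Reading that header takes only a handful of BinaryReader calls; a minimal sketch (the names here are mine; the real reader lives inside DbcTable):

```csharp
using System.IO;

struct DbcHeader
{
    public uint Magic;
    public uint RecordCount;
    public uint ColumnCount;
    public uint BytesPerRecord;
    public uint StringBlockLength;

    // Reads the five-field 'WDBC' header from the current stream position.
    public static DbcHeader Read(BinaryReader reader)
    {
        DbcHeader h;
        h.Magic = reader.ReadUInt32();           // 'WDBC', 'WDB2', or 'WCH2'
        h.RecordCount = reader.ReadUInt32();
        h.ColumnCount = reader.ReadUInt32();
        h.BytesPerRecord = reader.ReadUInt32();  // 4 * ColumnCount in practice
        h.StringBlockLength = reader.ReadUInt32();
        return h;
    }
}
```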

Each column is one of:

  • Int32
  • Float32
  • String (an int32-offset into the String Table)
  • "Flags" (a uint32 but usually has a fixed set of bit combinations)
  • Boolean (just 0 or 1)

So this pretty well defines the problem space. We need to support deserializing this binary format into plain objects, so that I can say I have a DbcTable<T> and my runtime will be able to enumerate the records in the table. Now, the CLR doesn't guarantee the order in which an object's properties are enumerated (at least to the best of my knowledge); it probably keeps a consistent order based on something, but I don't know what that order is based on, so I'll have to do something explicit about ordering.
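That something is an explicit column-position attribute on each field; GetTargetInfoForType in part 2 above consumes a DbcRecordPositionAttribute. The attribute itself never appears in these posts; a minimal version might look like this:

```csharp
using System;

// A minimal sketch; the real attribute in DbcExplorer may differ.
[AttributeUsage(AttributeTargets.Field | AttributeTargets.Property, AllowMultiple = false)]
public sealed class DbcRecordPositionAttribute : Attribute
{
    // Zero-based column index of this member within a record.
    public int Position { get; private set; }

    public DbcRecordPositionAttribute(int position)
    {
        Position = position;
    }
}
```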

Briefly, let's look at DBFilesClient\CharTitles.dbc. This file (at least as of the most recent patch) has six columns. I don't know for sure, but it looks like the following:

Column    Type       Description
0         Int32      ID
1         Int32      Required achievement ID
2         String     Title
3         String     Title, repeated
4         Int32      Unknown, just seems to continuously increase
5         Int32      Reserved (all records have 0)

Since I don't know what to do with columns 3-5, I can just define the following class:

public class CharacterTitleRecord
{
    public int ID;
    public int RequiredAchievementID;
    public string Title;
}

Next time: We'll see how the naïve implementation deserializes each record.