
1 Mar 2015

Richmond 0.3 released

Release Notes

0.3.0

  • Richmond was failing on binary files. Richmond can now be configured to ignore
    or target specific files or directories via a .richmond file found in the root
    of the directory being processed.

Configuration

Richmond will look for a file called .richmond in the root of the directory you are running against.
If it finds one, it will load it.

You can use the select or reject methods to change which files Richmond will include or exclude
in its processing.

# Richmond doesn't like binary files
Richmond.reject do |file|
  file.match /images/
end

# I only want to parse .rb files
Richmond.select do |file|
  file.match /\.rb$/
end
18 Feb 2015

My First Gem – Richmond

The Context

I'm building a Ruby API in Sinatra and publishing its documentation using Swagger. I didn't really like any of the tools for generating Swagger docs from Ruby code, so at first I was handwriting the JSON files myself. Then I decided that it would be easier to manage in YAML.

YAML was definitely a better solution than handwriting the JSON, but I still wished that the documentation for the models and APIs was embedded in the code it described.

The Solution

I decided that the documentation would live with the code come hell or high water. I figured I could use Ruby's block comments with a little bit of special formatting to identify documentation sections throughout the code base and lace them all together into one or more output files. The result is Richmond (named after the character in The IT Crowd who would have benefited from some documentation telling him what all the blinking lights did).

Usage

After you install the gem, you can run Richmond from the command line like so:

richmond /dir/to/scan

You can find the code for Richmond on GitHub, along with additional documentation.

30 Jan 2015

Installing an Optimal Vim Experience on Windows

Vim

I'm assuming you'd rather not compile Vim for Windows yourself. I don't blame you. I tried it and it's a nightmare.

These are manual steps, unfortunately. :(

For the best Vim experience, you should get a version compiled with Ruby and Python support.

Ruby and Python support does not necessarily mean that you will be coding in Ruby or Python, but many of the plugins you will want to use require Vim to be built with Ruby and Python integration.

These instructions assume you are working in a PowerShell console and that Chocolatey is installed.

Install Ruby

choco install ruby -Version 2.0.0.59800

If you need other versions of Ruby, go ahead and install them now.

Uru

You only need to install uru if you are using multiple Rubies. RVM does not work on Windows and Pik is no longer supported. Uru works fine.

Install Uru

You'll need to add your ruby installations to uru using

uru admin add /path/to/ruby/bin

Install Python

choco install Python2

Install Vim

Now you are ready to install Vim.
Unfortunately, the version available on Chocolatey.org is hard to get working with Ruby and Python on Windows.
However, Alexander Shukaev (Haroogan) has compiled a version that works nicely.

Download Vim from Here

Extract the zip file where you want it and make sure the location of vim.exe appears in your PATH ahead of any other Vim installations on your system (for example, the one bundled with msysgit).

Verification

If everything worked, you should be able to execute the following commands in Vim.

:echo has('ruby') => 1
:echo has('python') => 1

Install .vimrc

I keep my .vimrc file as a gist.

I clone it into my C:\git directory.

git clone https://gist.github.com/crmckenzie/4913add34cd30abd4b93 vimrc

Then I create a symlink from my $HOME directory to the vimrc file. This allows me to maintain the file across machines using git as a synchronization tool.

From PowerShell:

cmd /C mklink .vimrc C:\git\vimrc\.vimrc

The "cmd /C" section is necessary in Powershell because not all cmd.exe commands have been ported yet.

Install Vundle

Vundle is a package manager for Vim. It uses GitHub as a package source. This is about as easy as it gets :)

git clone https://github.com/gmarik/Vundle.vim.git ~/.vim/bundle/Vundle.vim

Install Bundles

Open vim and run

:BundleInstall

This should download the bundles specified in .vimrc.

I'll maintain this documentation in my configure-win-dev-workstation repo.

26 Oct 2014

What We Can Learn From Usability Failures in Destiny

Destiny is a great game. It is getting a lot of complaints due to some obvious failures such as not having enough content, lacking LFG or match-making mechanics for the more difficult missions, and not having a lot of character customization options. These are all true and valid criticisms, but I still find myself playing the game a LOT.

Recently I've found myself focusing on another kind of failure in the game. These are user-experience failures. I thought that it might be fun and useful to clearly identify them because some of the principles involved may be useful in other kinds of software.

Orbit

Destiny is constructed as a series of play areas where you can do different things. There are areas with missions and enemies (Venus, Earth, the Moon, and Mars), a safe-zone on Earth called The Tower in which you can trade with merchants and receive bounties, and Orbit where you can see all these areas and decide where you want to go next.

Orbit is useless.

It should not be in the game.

Orbit basically lets me look at a map of places to go, select one, and go there. This would be fine if I didn't have to pay the cost of load-time in order to transition from one area to another. If I'm on Earth and I want to turn in some bounties and then go to Mars, I have to:

1) go to orbit (load time)

2) go to the Tower (load time)

3) go back to orbit (load time)

4) go to Mars (load time)

Instead I should be able to:

1) go to the Tower (load time)

2) travel to Mars (load time)

In fact, I should be able to travel directly from any one area to any other area without having to load a special context in order to see a map. If I want to skip the Tower, I should just be able to:

1) go to Mars (load time)

Why is orbit even a thing? Why can't I access the map from any location? Why can't I stay in the world I'm in after a mission?

The Sin: Wasting User's Time

Destiny shoehorns the player into a workflow that doesn't make sense for the player and wastes their time. If there is a technical reason for this workflow, Bungie should solve the underlying problem in such a way that the user's play flow is streamlined.

Joining and Leaving a Fireteam

Let's say I'm in the Tower and I'm invited to join a fireteam. Let's stipulate further that the fireteam is also in the Tower. The game

1) immediately takes me to orbit

2) and takes me back to the Tower.

WTF?

Likewise, if I'm in a fireteam and I leave it, the game almost always takes me out of where I am and sends me to Orbit (which, as I've already stated, is useless).

The problem here is likely technical (but it's solvable). I imagine that the issue is that I'm in the Tower on a different server than the rest of the fireteam. Bungie has chosen to solve this problem by taking me out of the current server and completely reloading a new context on the new server with the same fireteam. Other MMOs have faced this problem and solved it without forcing the player to reload the same environment they're already in.

The Sin: Wasting User's Time

Destiny forces the player to endure the consequences of Bungie's design decisions.

Cut Scenes

OMFG I can't believe I'm even having to say this so late in the development of video games. Unskippable cut-scenes are the second-worst sin in games. The worst is an unskippable cut-scene that starts immediately before an epic battle which will likely kill the hero many times before the player succeeds. Having to watch the same cut-scene over and over between attempts frustrates the player and drains enjoyment out of the game-playing experience. Some games are so bad with this that I have stopped playing them altogether.

Attention Game Developers--I'm interested and willing to watch your cut-scenes once. I know you put a lot of work into them and want the players to see them and enjoy them. However, the game is supposed to be fun, and watching the same @$#*$_)(*!@&)(*&$ movie over and over again makes the game not fun. Please for the love of all that is bright and good in the world, stop!

If I were a professional game reviewer, I would immediately dock 20% off of the game score for doing this once. In Destiny, no cut-scenes are skippable. Gah!

The Sin: Wasting User's Time

Failing to understand what the player wants out of your game and instead pushing your own agenda on the player leaves a bad taste in the player's mouth.

Summary

Hmm, it looks like all of these problems are variants of the same thing. The user's time is valuable, and the workflows you design for your software should get users where they want to go as fast as possible.

Destiny is a great game. It's a well-executed MMOFPS and I'm still enjoying playing it. However, it has some serious warts. All it will take is a more competent competitor to enter this space to get me to play (and recommend to my friends) something else.

13 Sep 2014

Seattle Code Camp 2014

I gave 2 presentations at Seattle Code Camp today.

The first was a talk about our internship program. I'm still trying to start the conversation on this one. So far it appears no one is talking about this topic. My PowerPoint slides are here: Scaling Craftsmanship Through Apprenticeship. They're not much more than a memory jog for me, but they were asked for, so I'm posting them.

The second talk was about Unit Testing Your Javascript. This one sort of went sideways when Chrome refused to load my demo site. That was a challenge! Still, I think it went off okay. The unfortunate reality is that unit testing in Javascript is still pretty hard. It still feels like it takes a lot of duct tape and baling wire to make it work. That said, I've put together a demo application that shows how we do it at work.

Thanks to everyone who attended my presentations. I hope you enjoyed them!

17 Jan 2014

Running a Software Development College Internship

Introducing a college internship program into a development organization can be a difficult challenge. There are precious few resources available on the topic. Here are some thoughts I’ve put together after working with interns over the last year and a half.

What Does Your Department Want?

Deciding what value you hope to gain will go a long way toward helping you decide what the features of a good internship program should be. From the intern’s perspective, they want an opportunity to learn, to network, and to gain experience. If you don’t have anything that you want out of offering an internship, it will be hard for you to run it effectively.

For our part, we want to expand our existing “learning organization” culture. We regard our interns as a potential source of new hires. We want to improve the overall quality of software by teaching new programmers some of our hard-won knowledge of principles, patterns, and practices. We want to aid our business by making interns available to work on small-scale projects that almost never get prioritized.

Establish a Primary and Secondary Mentor

Each intern should have a primary mentor who is responsible for directing their day-to-day activities. In addition, each intern should have at least one adjunct mentor who checks in with them each week to see how they are doing. Often a different ear will hear things that the primary mentor will not, such as if the primary mentor is moving too fast or too slow, or if a different teaching technique would be helpful. This feedback is important to gain early so that the learning process can be tailored to the individual intern.

Patience

As a mentor and trainer to interns you will be confronted with the sheer number of concepts, processes, tools, and techniques that you simply take for granted. In our shop, interns routinely have never seen source control, build servers, unit testing, code bases with more than a few hundred lines, third-party tools such as ReSharper, third-party libraries such as jQuery, or project management tools. We have to teach them all of those things before they can be productive in our business.

In its current incarnation, our interns spend 20+ hours per week in the office. Below are the materials that I really want them to learn over the course of their internship. Time is short and there is a lot to learn, but the mentor should resist the desire to go too fast. I’m covering concepts with my interns that it took me 10 years to learn. I’m trying to introduce them to these concepts so that they understand them well enough to begin using them in just over a month. This is an enormous challenge for both the intern and the mentor. The mentor needs to remain patient and allow the intern time to digest and practice what has been covered so far before diving into the next subject.

Most importantly, do not push the interns to learn topics that build on other topics they do not yet understand. For example, don’t teach mocking tools before they’ve learned the basics of Test Driven Development.

Practice

Interns need time to practice the material you are teaching them. Remember, you have been doing this for years. They’ve been doing this for days. The difference is important. Give them breakable toys to work on. Teach them the concept. Pair with them to practice the technique the first time. Follow up with a different practice exercise for them to do on their own to be sure they have it. If you can’t find a good practice exercise, create one. If the intern does not seem to be “getting it” from you, consider having the intern work with someone else for the material in question. Sometimes an alternate perspective is all it takes to make the material sink in. If that still doesn’t work, move on to something else and come back to the material later.

Pairing

It can often be hard for interns to “get started” with whatever task they’re supposed to be practicing. For example, after reading about Test Driven Development, they may feel confident that they understand the concepts. In practice however, they are often unable to write their first tests. As the primary mentor, you will need to work directly with them to help them “get over the hump.” By pairing with them, you will see concretely what they do and do not understand. You should give broader context for what they do understand, and direct their learning and practice toward what they do not understand.

Training

Learning is what the intern is here for, but where do you start? How do you work with someone with no prior work experience? How do you teach them all those things you take for granted? We want to welcome new people into the software development industry, but we don’t want to sacrifice code quality.

I accomplish this in a couple of ways.

The first thing I do with a new intern is set up an online Kanban board with them. I’m using Trello right now. This is important for keeping track of each intern’s progress. Further, the intern’s learning is a project in its own right, so it’s natural to treat it the way we treat other projects.

During the first month they are given the time and space they need to read, watch training videos, and practice what they’ve learned on breakable toys. I spend 1 to 3 hours per day working side-by-side with the intern during this period to guide their learning. The order of the learning is not terribly important, though there are logical dependencies between some concepts that need to be observed.

I require that all of their work be placed in source control. I want the intern to get comfortable with GitHub right away, as all work should be done using source control. Many interns have never worked with source control at all, so learning git right away can be challenging. A good exercise for learning GitHub is to have them import their school projects into GitHub. If the intern likes the command line, that’s fine, but I usually tell them about great GUI tools such as SourceTree. I find that the GUI really helps people new to Git grok the concepts. (I still use the GUI myself.)

The other thing I think it’s important to discuss early is the value of clean code. Our interns are given copies of Clean Code and The Clean Coder on their first day. Of all the books listed, this is the one it’s most important to me that they read. A recent idea I had about how to practice cleaning code is to have the intern use their own school projects (now conveniently located in GitHub) as refactoring exercises.

An intern is not going to master any of the content (that takes continued practice over years), but they will be much further ahead starting their career armed with a basic grasp of the principles, patterns, and practices that it took the rest of us years to learn. The learning should be aggressive and focused on making them productive in your environment as quickly as possible.

Real Code

It’s natural that interns would want to work on something useful for the business. The biggest bottlenecks to making that happen for us are process compliance, automated testing, their ability to navigate large projects, and knowledge of software architectural patterns. Our shop places a high value on automated testing for production code. TDD is difficult for experienced developers to learn, so asking an intern to do it on the first day might be a bit much. This is why we don’t expect useful code out of our interns in the first month. That time is to be spent learning.

After the first month we begin introducing them to code bases we intend to use. These code bases are typically larger than the code that they have been working with in school, so it’s important to spend some time teaching them how to navigate the projects. This time should include a discussion of the logical division of responsibilities inside the project as well as mundane items such as useful keyboard shortcuts in whatever developer tools you are using.

Internship Curriculum

Below is the basic curriculum I put interns through in their time with us. I encourage them to get through as much of the reading and training materials as possible. I start putting them on real code after the first month. I pair with them to find out what they do and do not understand. I do not push them to study topics that build on concepts they do not already grasp. This curriculum is tailored for our environment. We are a Kanban, C#, Continuous Integration shop that highly values clean code. Your curriculum should be tailored to the tools that you use every day in your work.

28 Sep 2013

Onion Architecture Presentation Resources

Thanks to everyone who attended my presentation at #SeattleCodeCamp this morning!

The code I demoed this morning is currently on the “develop” branch on GitHub.

Here are some of the resources for further reading I alluded to.

I had some additional thoughts for future revisions of the presentation.

  1. OA is not just about coding to interfaces. It’s also about coding to the right category of interfaces (see the sketch after this list).
  2. My presentation is in C#, but the same principles apply in any programming language. The implementation details in other languages may differ.
  3. Every recommended approach to Onion has a “fat” layer where the important code is. Other layers are basically façades and coordinators.
  4. In the future, I’d like to show my code diagram first, then show the code, then show the other Onion diagrams.
  5. I need to update my image for my code diagram as the font doesn’t show up well on washed out projectors.
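
To make point 1 concrete, here is a minimal, hypothetical C# sketch (not the code from the presentation): the Order model and the IOrderRepository interface live in the core and are expressed purely in domain terms, the application-layer CheckoutService is a thin coordinator over them, and only the outermost infrastructure layer knows anything about the database.

    // Core (innermost layer): the domain model plus the interfaces the domain needs.
    // This is the "right category" of interface: it speaks in domain terms,
    // not in terms of ISession, DbContext, or SQL.
    public class Order
    {
        public int Id { get; set; }
        public decimal Total { get; set; }
    }

    public interface IOrderRepository
    {
        Order Get(int id);
        void Save(Order order);
    }

    // Application layer: a thin coordinator that depends only on the core.
    public class CheckoutService
    {
        private readonly IOrderRepository _orders;

        public CheckoutService(IOrderRepository orders)
        {
            _orders = orders;
        }

        public void ApplyDiscount(int orderId, decimal percent)
        {
            var order = _orders.Get(orderId);
            order.Total -= order.Total * percent;
            _orders.Save(order);
        }
    }

    // Infrastructure (outermost layer): the only place that knows about the database.
    // public class NHibernateOrderRepository : IOrderRepository { ... }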

I’d also like to find a DDD implementation of OA and at least one written in another language (not Java).

22 Aug 2013

Isg.EntityFramework 0.9.0 Released (Bug Fix)

Release Notes

  • Fixed a bug in TypeInterceptor in which IsTargetEntity() was not being called before passing the handling down the inheritance chain.
19 Aug 2013

Don’t Unit Test NHibernate: Use Generic Repository

I was reading this Stack Overflow question: How can I solve this: Nhibernate Querying in an n-tier architecture?

The author is trying to abstract away NHibernate and is being counseled rather heavily not to do so. In the comments there are a couple of blog entries by Ayende on this topic:

The false myth of encapsulating data access in the DAL

Architecting in the pit of doom: the evils of the repository abstraction layer

Ayende is pretty down on abstracting away NHibernate. The answers on Stack Overflow push the questioner toward just standing up an in-memory SQLite instance and executing the tests against that.

The SQLite solution is pretty painful with complex databases. It requires that you set up an enormous amount of data that isn’t really germane to your test in order to satisfy FK and other constraints. The ceremony of creating this extra data clutters the test and obscures the intent. To test a query for employees who are managers, I’d have to create Departments and Job Titles and Salary Types etc., etc., etc. Dis-like.

What problem am I trying to solve?

In the .NET space developers tend to want to use LINQ to access, filter, and project data. NHibernate (partially) supports LINQ via an extension method off of ISession. Because ISession.Query<T> is an extension method, it is not stubbable with free mocking tools such as RhinoMocks, Moq, or my favorite: NSubstitute. This is why people push you to use the Sqlite solution—because the piece of the underlying interface that you want to use most of the time is not built for stubbing.
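
To make that concrete, here is a rough sketch of the kind of stubbing Find<T> allows. It uses NSubstitute and a hypothetical Employee entity, and it targets the IRepository interface shown a little further down; because Find<T> is an ordinary interface member rather than an extension method, the mocking tool can intercept it and hand back an in-memory IQueryable<T>.

    using System;
    using System.Linq;
    using NSubstitute;

    // Hypothetical entity, used only for illustration.
    public class Employee
    {
        public string Name { get; set; }
        public bool IsManager { get; set; }
    }

    public class EmployeeQueryExample
    {
        public static void Run()
        {
            // Stub Find<T> to return an in-memory IQueryable<T>.
            // This is the step that is impossible against ISession.Query<T>,
            // because extension methods cannot be intercepted by a mocking tool.
            var repository = Substitute.For<IRepository>();
            repository.Find<Employee>().Returns(new[]
            {
                new Employee { Name = "Moss", IsManager = false },
                new Employee { Name = "Jen", IsManager = true },
            }.AsQueryable());

            // The code under test simply composes LINQ on top of Find<T>.
            var managers = repository.Find<Employee>()
                                     .Where(e => e.IsManager)
                                     .ToList();

            Console.WriteLine(managers.Count); // 1
        }
    }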

I think that a fundamental problem with NHibernate is that it is trying to serve 2 masters. On the one hand it wants to be a faithful port of Hibernate. On the other, it wants to be a good citizen for .NET. Since .NET has LINQ and Java doesn’t, the support for LINQ is shoddy and doesn’t really fit in well with the rest of the API design. LINQ support is an “add-on” to the ported Java API, not a first-class citizen. I think this is why it was implemented as an extension method instead of as part of the ISession interface.

I firmly disagree with Ayende on Generic Repository. However, I do agree with some of the criticisms he offers against specific implementations. I think his arguments are a bit of a straw man, though. It is possible to do Generic Repository well.

I prefer to keep my IRepository interface simple:

    public interface IRepository : IDisposable
    {
        // Exposes an IQueryable<T> so callers can compose where-clauses
        // or apply specifications on top of it.
        IQueryable<T> Find<T>() where T : class;

        // Retrieve a single entity by its key.
        T Get<T>(object key) where T : class;

        void Save<T>(T value) where T : class;

        void Delete<T>(T value) where T : class;

        ITransaction BeginTransaction();

        // Escape hatch for stored procedures or raw SQL.
        IDbConnection GetUnderlyingConnection();
    }


Here are some of my guidelines when using a Generic Repository abstraction:

  • My purpose in using Generic Repository is not to “hide” the ORM, but
    • to ease testability.
    • to provide a common interface for accessing multiple types of databases (e.g., I have implemented IRepository against relational and non-relational databases).
  • Most of my storage operations follow the Retrieve-Modify-Persist pattern, so Find<T>, Get<T>, and Save<T> support almost everything I need.
  • I don’t expose my data models outside of self-contained operations, so Attach/Detach are not useful to me.
  • If I need any of the other advanced ORM features, I’ll use the ORM directly and write an appropriate integration test for that functionality.
    • I don’t use Attach/Detach, bulk operations, Flush, Futures, or any other advanced features of the ORM in my IRepository interface. I prefer an interface that is clean, simple, and useful in 95% of my scenarios.
  • I implemented Find<T> as an IQueryable<T>. This makes it easy to use the Specification pattern to perform arbitrary queries. I wrote a specification package that targets LINQ for this purpose (see the sketch after this list).
    • In production code it is usually easy enough to append where-clauses to the exposed IQueryable<T>
    • For dynamic user-driven queries I will write a class that will convert a flat request contract into the where-clause needed by the operation.
  • I expose the underlying connection so that if someone needs to execute a sproc or raw SQL, there is a convenient way of doing that.
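
Here is the sketch referenced above: a rough illustration (not production code) of how these guidelines play out against the IRepository interface shown earlier. EmployeeSearchRequest, EmployeeService, and the Employee entity are hypothetical names invented for the example; the dynamic query simply appends where-clauses to the IQueryable<T> from Find<T>, and Promote follows the Retrieve-Modify-Persist pattern.

    using System.Collections.Generic;
    using System.Linq;

    // Hypothetical flat request contract for a dynamic, user-driven query.
    public class EmployeeSearchRequest
    {
        public string NameContains { get; set; }
        public bool? IsManager { get; set; }
    }

    public class EmployeeService
    {
        private readonly IRepository _repository;

        public EmployeeService(IRepository repository)
        {
            _repository = repository;
        }

        // Translate the flat request into where-clauses appended to the
        // IQueryable<T> exposed by Find<T>.
        public IList<Employee> Search(EmployeeSearchRequest request)
        {
            var query = _repository.Find<Employee>();

            if (!string.IsNullOrEmpty(request.NameContains))
                query = query.Where(e => e.Name.Contains(request.NameContains));

            if (request.IsManager.HasValue)
                query = query.Where(e => e.IsManager == request.IsManager.Value);

            return query.ToList();
        }

        // Retrieve-Modify-Persist: Get<T>, change the model, Save<T>,
        // all against the same small interface.
        public void Promote(int employeeId)
        {
            using (var transaction = _repository.BeginTransaction())
            {
                var employee = _repository.Get<Employee>(employeeId);
                employee.IsManager = true;
                _repository.Save(employee);
                transaction.Commit();
            }
        }
    }
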
6 Aug 2013

Isg.EntityFramework 0.8.0 Released

InterceptionContext is now passed to TypeInterceptor methods and ChangeInterceptor methods. This may result in breaking changes depending on whether and how you have inherited from those classes, but I’ve done my best to preserve existing behavior. I marked the obsolete methods as such.

The purpose of this change is to enable the scenario where you want to write a log record back to the database when a record is saved or deleted.

Update:

0.8.0 did not contain the updated assemblies.

0.8.1 does.

What happened?

My build server is configured so that it only creates and publishes packages from the last pinned build. I forgot to pin the build that has the changes. I've pinned the build and republished 0.8.1. I've created a work item for myself to separate package creation from package publication so that I can inspect the package before it's sent to NuGet.