Marius Gheorghe

Building software : make it work, make it good, make it fast

The (in-house) software development equation

It's a simple equation with three variables: time (and, implicitly, money), the set of features/bugs to tackle and code quality. For best results the trick is not to overemphasize one over the remaining two but to keep all three balanced. Sounds simple, but sometimes it feels like the hardest thing in the world.

 

Non-functional requirements

Non-functional requirements are just as important as functional ones, yet not much thought is given to them (especially at the beginning, when writing the specs for a new "system"). For instance, everyone obviously wants a secure application (and security is a boolean: an application is either secure or it isn't, there are no intermediate steps), so a new application will not be released if it's insecure, no matter how many functional requirements are implemented.

The idea is to always keep non-functional requirements in mind when designing a new "system". Some of them (like capacity/performance or security) can even have a direct impact on the functional requirements.

It's just a binding failure

A bug in the GitHub extension for Visual Studio caused some serious damage for some dude. It was very sad to see the actual cause of the bug though: a frigging WPF checkbox binding failure caused GitHub projects to always be created as public.
It's funny because a lot of people who do XAML have figured out that MVVM/binding is the wrong way to tackle UIs of reasonable complexity, but on the other hand it's still sad to see it fail so miserably.

Universal Windows apps - risking the quality of the apps

I've played around for a bit with a Lumia device running the latest Windows 10 Mobile. To be honest, the entire switch to "Universal Windows apps" doesn't seem too great from a user perspective.

The main thing is that no matter how great the API is, it is HARD to make an app "work" all the way from a 4 inch phone (using touch as input) to an Xbox One running on a 70" TV using a controller for input. Different device sizes and different input methods are usually handled differently, so "hiding" all this under an API won't necessarily make the apps great.

Just look at Android: even after all these years you can still find apps which suck when running on a tablet (they don't make proper use of the entire screen real estate). Never mind the fact that the OS still doesn't support basic features like running two apps side by side (and never mind the Samsung hacks, which have their own share of problems).

To me, with the universal runtime, it seems Microsoft risks ending up with more apps that are mediocre on multiple devices rather than fewer apps that run and work great on each device type. Windows Mobile / Windows 10 won't compete with Android / iOS on the number of apps anytime soon (if ever).

Is it worth risking the user experience for "many apps"? Personally, I don't think so.

Immutable collections in .NET

Dealing with immutable collections in .NET is very easy with the help of the ImmutableCollections package on NuGet.

Here's an example:

ImmutableList<string> initial = ImmutableList<string>.Empty;
ImmutableList<string> immutableList = initial.Add("stuff");


Any operation on the collection will create a new collection (so in the above example the "initial" collection is still empty).
Another, much simpler, way would be to use a ReadOnlyCollection (and the IReadOnlyCollection interfaces from .NET 4.5):
List<int> initial = Enumerable.Range(1, 10).ToList();
ReadOnlyCollection<int> readOnlyCollection = initial.AsReadOnly();

Now you can pass readOnlyCollection around and whoever receives it can't modify the initial collection through it. Keep in mind it's a wrapper over the initial list, not a copy, so changes made later to "initial" will still be visible through it.

A few thoughts on the Jade templating language

I started using Jade on a pet project a few weeks ago. It's a nice library that saves you a few keystrokes while writing HTML but, sadly, there are still two things I really dislike about it:

- it's too different from "regular" HTML to be picked up quickly by beginners, which creates a barrier to entry. Also, the "size" savings are not that impressive compared with "regular" HTML.

- it allows embedding a subset of JS directly inside the markup, which can easily become a pain in the ass to debug. Breaking separation of concerns is a big no-no from my POV.

The build script

Ah, the build + optional deploy script. The "thing" that turns whatever your compiler spits out into a zip/msi/whatever that you can actually install / pass along to customers. For build scripts I've gone from batch files, C# and PowerShell, to msbuild tasks (yeah, that was "fun"), to FAKE, to the node.js "task runner" framework du jour (Grunt, Gulp etc.), to finally get back to PowerShell.

I still think the best way to write a build script is in a shell scripting language (PowerShell, bash, whatever) because:

- when it crashes (and it will!) you're debugging code written by you instead of fussing around with a stack trace spat out by some shitty Grunt plugin (for instance).

- ubiquity: only PowerShell/bash is required to run it (compared with node.js + npm + gulp + whatever other plugins you are using).

- it's the simplest way to run 3rd party CLI apps as part of the build process ("& filePath args" and you're done, see the sketch after this list).

- everything is in one place (no package.json, gruntfile(s) and so on). A single file that handles everything.
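
To give an idea, a stripped-down script along those lines might look something like this (the solution name, msbuild path and output folder are just placeholders for whatever your project actually uses, and Compress-Archive needs PowerShell 5+):

$ErrorActionPreference = 'Stop'

# placeholder paths - adjust for your own solution / msbuild install
$solution = Join-Path $PSScriptRoot 'MyApp.sln'
$msbuild  = 'C:\Program Files (x86)\MSBuild\14.0\Bin\MSBuild.exe'

# running a 3rd party CLI app: "& filePath args" and you're done
& $msbuild $solution /p:Configuration=Release
if ($LASTEXITCODE -ne 0) { throw "build failed with exit code $LASTEXITCODE" }

# package whatever the compiler spat out into a zip you can pass along
Compress-Archive -Path (Join-Path $PSScriptRoot 'MyApp\bin\Release\*') -DestinationPath (Join-Path $PSScriptRoot 'MyApp.zip') -Force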


A real world example of why sanity testing is important

Here's a very interesting real world example of why simple sanity tests before a release are important. Metro 2033 Redux and Metro Last Light Redux have just been released with a giant bug (both games crash right at startup). Apparently they only crash on CPUs that do not support SSE 4.1 (most likely the "bug" is that they compiled with very aggressive optimizations and an SSE 4.1 instruction gets generated).

Obviously they would have found this big problem just by running the games on a few machines with different configurations.

TypeScript and static typing

My favorite thing about TypeScript is, without a doubt, static typing. No more dynamic bullshit; now you can actually have strong typing for DOM interaction (for instance):

var upload: HTMLInputElement = <HTMLInputElement>document.getElementById("fileUpload");

And, obviously, the spiffy IntelliSense that comes with it.

A few tips and tricks for using PowerShell as a shell language (part I)

Navigating around the filesystem

The first thing you have to do is add some "shortcuts" to your usual paths. The simplest way to do that is to edit the PS profile file (you can find it at [MyDocuments]\WindowsPowerShell\Microsoft.PowerShell_profile.ps1). The "shortcuts" can be added in two ways (each with its advantages and disadvantages):

- add the shortcut as a variable which points to the right path.

$dx = 'c:\dropbox'

To use it just cd $dx and you'll navigate to c:\dropbox. The advantage of using variables is that you can reuse them when writing code interactively. The disadvantage is that you always have to "cd" manually to actually navigate to the path.

- if you don't really care about "reusing" that variable, you can declare it as a function instead:

function %dx
{
   cd c:\dropbox
}

To use it just type %dx and you'll navigate to that path. I also really recommend using a standard naming convention for these (I prefix the "shortcuts" with the % char).
For navigating around the filesystem, "ls" lists the files/subfolders in the current path. Obviously you can also filter the results like this:

ls *.pdf

This will list just the pdf files from the current path.

File operations

With PS, you can use the old DOS commands (like mkdir, del etc.), the PS "native" cmdlets or even the Linux utilities (try GOW, a lightweight collection of GNU/Linux utilities for Windows). Personally I prefer the old DOS commands, but the important thing to note here is that you have the freedom to use 3rd party utilities instead of the built-in cmdlets if they do a better job. You can even add functions to your PS profile that invoke other utilities with predefined CLI parameters to save you some typing.

So here's how I do most filesystem operations:

- delete
The DOS command is del myfile.txt to delete a single file. It also works with wildcards: del my*.jpg
The PS "native" way is to use the Remove-Item cmdlet: something like Remove-Item myfile.txt. But, obviously, you don't want to type "Remove-Item" each time, so you can use the alias "rm".
If you prefer the "unix" way, just run the GOW rm utility: rm.exe myfile.txt (the .exe is needed because inside PS plain rm resolves to the Remove-Item alias).

- create folders / files
mkdir myFolder to use the DOS/Linux command, or New-Item myFolder -ItemType Directory to use the built-in cmdlet. For files, you usually want to edit them right after creation; personally I just run vim to create and edit the file, using a function defined in my profile. Something like vimedit a.txt, where vimedit just runs vim with the specified path as a parameter (there's a sketch of it after this list).

- copy/move files
Copying and moving files from the command line is not the most straightforward thing ever, but it's manageable. Usually I use copy/xcopy (built-in Windows tools) or robocopy. Personally, I never really remember all the CLI switches for the more advanced copy/move operations, so I usually rely on FarManager (since it can be run inside the terminal).

- grepping for text inside files
For finding a specific text pattern inside multiple files I'm using a Select-String wrapper like:

function psgrep($path, $fileType, $string)
{
   get-childitem $path -include $fileType -rec | select-string $string
}

Use it like this:
psgrep c:\oss\ *.cs "static"
This will search for the string "static" in *.cs files under c:\oss.

- searching for files by name:
I use a Get-ChildItem wrapper like this:
function psgrepFile($path, $fileType)
{
  get-childitem $path -include $fileType -rec
}
Invoke it like this:
psgrepFile c:\oss a*.xml
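
And since I mentioned it in the create files section, a vimedit helper could be as simple as this (the vim path below is just a placeholder for wherever your vim.exe actually lives):

function vimedit($path)
{
   # create the file first if it doesn't exist yet
   if (!(Test-Path $path)) { New-Item $path -ItemType File | Out-Null }
   # then open it in vim
   & 'C:\Program Files (x86)\Vim\vim74\vim.exe' $path
}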

Lightweight tools and workflow with SqlServer

The SqlServer database server and its associated tools get heavier and more cumbersome with each new version. It's a bit ridiculous to install 2GB of stuff just so you can develop something on your local machine. Fortunately, there are lightweight alternatives. Here's what I ended up using for this scenario:

- Install SqlServer 2014 LocalDB. It's available here (the link points directly to the x64 version). This is the SqlServer edition created especially for this scenario.

- after installing it, you can manage the server using sqllocaldb.exe (located in C:\Program Files\Microsoft SQL Server\120\Tools\Binn). The notable commands are start and stop (to start/stop the default instance). The entire list of CLI switches is available here.

- create your database using the default server instance (recommended). By default the database files end up in your UserProfile folder, but you can override this and put the db file wherever you want:
 create database foo on (name='foo', filename='c:\DBs\foo.mdf')

- if you need a lightweight GUI alternative to Management Studio, I recommend Query Plus Ex. You can connect to the default LocalDB server using the server name (localdb)\MSSQLLocalDB

- if you prefer a CLI to the GUI tool, you can use sqlcmd.exe (which is sadly missing from the LocalDB install); you'll have to install the SqlServer Express edition to get it.

- update your app's connection string to point to the new server and off you go (there's a quick recap below).
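
Put together, a typical session looks something like this (the database name, file path and connection string are just examples, and the sqlcmd step assumes you've installed the Express edition mentioned above):

# start / stop the default LocalDB instance
sqllocaldb start MSSQLLocalDB
sqllocaldb stop MSSQLLocalDB

# create the database with the data file in a custom folder
sqlcmd -S "(localdb)\MSSQLLocalDB" -Q "create database foo on (name='foo', filename='c:\DBs\foo.mdf')"

# and then an app connection string along these lines:
# Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=foo;Integrated Security=True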

So what are you ?

I ask my fellow developers this question pretty often. I'm actually very interested to see how exactly they perceive the work they do. Most of the time I get back the boring (and untrue) "engineer". Most people "doing software" seem to think they are engineers. This couldn't be further from the truth (if you don't believe me, speak with a "regular" engineer from another field and compare your work with theirs).

Another category of people think of themselves in more abstract terms: poets (because writing code is exactly like writing poetry, right?) or even philosophers. Obviously, this is bullshit...

Other people think of themselves as craftsmen (personally I think this is a lot closer to the truth). At the end of the day, building software is a lot like regular craftsmanship. We use tools just like other craftsmen (instead of nails and hammers we use IDEs and debuggers) to "craft" something. The result of our work is, hopefully, something that other people enjoy using.

What I don't understand is why people get upset when I tell them there isn't much of a difference between what they do and what people who build chairs do...