ALT.NET; London; 13 Sept 2008

Intro

Debate over what ALT.NET is: should it have a set of guiding principles like the Agile Manifesto?

Continuous integration & deployment

There seemed to be three major areas where people encountered difficulties doing continuous integration & deployment:

  1. Configuration files
  2. DB schema migrations
  3. Data migrations
Best-practice approaches discussed were:
Config files
  1. Make sure your config files are small and contain only the config data that changes often (DB connection strings, file paths, etc.). Put all your “static” config data into separate files (DI container config, etc.).
  2. Consider templated config files, where environment-specific values are injected during the deploy process (see the sketch after this list).
  3. Keep all config in simple text files in source control.
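As a rough illustration of point 2, here is a minimal Python sketch of deploy-time templating, assuming a template file named app.config.template with ${...} placeholders; the file name and setting names are invented for the example:

    # Deploy-time config templating: inject environment-specific values
    # into a template, failing the deploy if a placeholder has no value.
    from string import Template

    # Per-environment values; in practice these come from your deploy tool.
    settings = {
        "db_connection": "Server=db01;Database=orders;Integrated Security=true",
        "upload_path": "/var/data/uploads",
    }

    with open("app.config.template") as f:
        template = Template(f.read())

    # substitute() raises KeyError on any missing placeholder, so a
    # half-configured deploy fails early rather than at runtime.
    with open("app.config", "w") as f:
        f.write(template.substitute(settings))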
DB schema migrations
  1. Migration techniques borrowed from Ruby on Rails – generate change scripts by hand or with tools like SQL Compare, then apply them using a versioning tool like dbdeploy (a minimal runner is sketched below).
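To make the dbdeploy idea concrete, here is a minimal sketch of a versioned migration runner in Python, assuming numbered scripts such as 001_create_orders.sql in a migrations directory; the names, and the use of SQLite, are assumptions for illustration, not dbdeploy itself:

    # Apply numbered SQL change scripts once each, in version order,
    # recording what has already run in a changelog table.
    import sqlite3
    from pathlib import Path

    conn = sqlite3.connect("app.db")
    conn.execute("CREATE TABLE IF NOT EXISTS changelog (script TEXT PRIMARY KEY)")

    applied = {row[0] for row in conn.execute("SELECT script FROM changelog")}

    for script in sorted(Path("migrations").glob("*.sql")):
        if script.name in applied:
            continue  # already applied on a previous deploy
        conn.executescript(script.read_text())
        conn.execute("INSERT INTO changelog VALUES (?)", (script.name,))
        conn.commit()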
DB data migrations
  1. Take a backup before any data migration.
  2. Ensure the app fails fast if there is a problem, because once the data has changed after deployment you cannot roll back.
  3. Consider apps that upgrade themselves and run smoke tests on startup, refusing to run if there is a problem – a technique used by established open-source projects such as WordPress, Drupal, and Joomla (sketched below).
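A minimal sketch of point 3, reusing the changelog table from the migration runner above; the expected-script count and the smoke-test queries are invented for the example:

    # Refuse to serve requests unless startup smoke tests pass.
    import sqlite3
    import sys

    EXPECTED_SCRIPTS = 42  # number of change scripts this build expects

    def startup_problem(conn):
        # Smoke test 1: is the schema at the level this code expects?
        (count,) = conn.execute("SELECT COUNT(*) FROM changelog").fetchone()
        if count != EXPECTED_SCRIPTS:
            return f"{count} scripts applied, expected {EXPECTED_SCRIPTS}"
        # Smoke test 2: is a critical table actually readable?
        try:
            conn.execute("SELECT 1 FROM orders LIMIT 1")
        except sqlite3.Error as e:
            return f"orders table unreadable: {e}"
        return None

    problem = startup_problem(sqlite3.connect("app.db"))
    if problem:
        # Fail fast: better to refuse to start than to mangle migrated data.
        sys.exit(f"Startup smoke test failed: {problem}")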
Mentioned tools: TFS, Subversion, CC.NET, JetBrains TeamCity, dbdeploy, SQL Compare.
Acceptance testing
It seemed to me that the majority of pain experienced in this area results from the lack of a ubiquitous domain-specific language:
  • Build a DSL incrementally during short iterations. This gives you the opportunity to refine it, fill in gaps, and train the whole team to use the same language.
  • Without a DSL, acceptance testing via the UI becomes brittle: you end up specifying your tests at too low a level (click button A, then check for the result in cell B), rather than translating acceptance tests written in a higher-level DSL into specific UI interactions (see the sketch after this list).
  • Consider prioritised tests – have a set of face-saving smoke tests that always work and confirm the major things still do (is the company phone number correct? does the submit-order button still work?). Acceptance tests can be thrown away once they have served their function of evolving the design and the team’s understanding.
  • The acceptance-testing trio: developers test for success, so automated testing only covers the happy path – you still need exploratory testing by someone with a testing mindset (what happens if you do weird stuff?), and that tester must have domain knowledge. The business says what should happen – don’t force developers to make up business rules.
  • Ensure all layers of the stack (tests, manuals, code, unit tests) use the same DSL vocabulary.
  • How do you get workable acceptance tests? See the Requirements Workshops book.
  • Short iterations – more focus, incremental specs, and opportunities to discuss missing test examples.
  • The key is having a ubiquitous language encoded as a DSL (domain-specific language) – it develops over time and enables automated acceptance tests.
  • Sign off against acceptance tests (the Green Pepper tool captures & approves acceptance tests).
  • Talk: “The Yawning Gap of ?? Doom” – InfoQ, Martin Fowler.
  • Avoid describing these activities as “testing” – people avoid them because testing has low social status.
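To illustrate the DSL point above, here is a sketch (in Python, with invented names) of a thin domain-language layer sitting between acceptance tests and a UI driver such as White; only this layer knows which widgets implement each domain action, so UI changes are absorbed in one place:

    class OrderingDsl:
        """Domain vocabulary the whole team shares in acceptance tests."""

        def __init__(self, driver):
            self.driver = driver  # low-level UI automation hides behind here

        def place_order(self, product, quantity):
            # "Place an order" in domain terms; the field and button names
            # below are the only part coupled to the UI.
            self.driver.fill("product", product)
            self.driver.fill("quantity", str(quantity))
            self.driver.click("submit-order")

        def order_is_confirmed(self):
            return "Order confirmed" in self.driver.read("status-bar")

    # An acceptance test then reads at the domain level:
    #   dsl = OrderingDsl(some_ui_driver)
    #   dsl.place_order("Widget", 3)
    #   assert dsl.order_is_confirmed()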
Mentioned tools: White for Windows GUI testing
Domain driven design
  • Discussion around the difference between DDD, where we treat the domain concepts & actions as central; DB-centred design, where we think of the data as central; and UI-centred design, where the screens are considered central.
  • Consensus was that the domain shouldn’t be tightly bound to the DB or the UI.
  • Ideas around passing DTO objects up to the view (UI, web services, etc.), and passing change messages back from the view indicating how the domain should be changed (rather than passing back the whole DTO, where you don’t know what has changed) – see the sketch below.
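A small sketch of that last idea, with invented type names: a read-only DTO flows out to the view, and a narrow change message flows back in, naming exactly what should change (the repository call is an assumed persistence step):

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class CustomerDto:
        """Flattened, read-only snapshot handed to the UI / web service."""
        customer_id: int
        name: str
        email: str

    @dataclass(frozen=True)
    class ChangeCustomerEmail:
        """Message naming the one thing to change, so the domain need not
        diff a returned DTO to discover what the user edited."""
        customer_id: int
        new_email: str

    def handle(repository, msg: ChangeCustomerEmail):
        # The domain object applies the change and enforces its own rules;
        # repository.load() is an assumed call, and saving is omitted.
        customer = repository.load(msg.customer_id)
        customer.change_email(msg.new_email)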
BDD
  • Defined as Dan North’s Given, When, Then
  • Is it any different from acceptance testing? Only that it is better branding: BDD doesn’t have the word “testing” in it, which stops people switching off at the word “test” when discussing specifications.
  • BDD is writing failing acceptance tests first, before writing code (a minimal Given/When/Then example is sketched at the end of this section).
  • Unit testing ensures that the code is built right; acceptance testing / BDD ensures that the right code is built.
  • The toolset is still immature. FitNesse’s .NET & Java tooling is the most mature; many BDD tools (other than Ruby’s RSpec) have been started and abandoned (NBehave, NSpec, etc.).
  • BDD is not about testing, it’s about communicating and automating the DSL. Be wary of implementing BDD in developer-only tools (e.g. NUnit), which prevent other team members (business, customers, testers) from accessing the specs.
  • Refactoring can break FitNesse tests, because they aren’t part of the code base.
  • Executable specs (via acceptance tests) are the only way to ensure documentation / test suites stay up to date and trustworthy.
  • Agile is about surfacing problems early (rather than hiding them until it’s too late to address them). So when writing acceptance tests up front is difficult, that is good: you are surfacing the communication problems early.
  • The real value is in building a shared understanding via acceptance criteria; rather than building automated regression test suite.
  • Requirements workshops can degenerate into long, boring meetings. To mitigate this problem
Tools: Ruby’s RSpec, JBehave, Twist, Green Pepper
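For readers new to the Given/When/Then style, here is a minimal example in Python; the Account domain and the assertions are illustrative assumptions, not anything shown at the session:

    class Account:
        def __init__(self, balance):
            self.balance = balance

        def withdraw(self, amount):
            if amount > self.balance:
                raise ValueError("insufficient funds")
            self.balance -= amount

    def test_withdrawal_reduces_balance():
        # Given an account with a balance of 100
        account = Account(balance=100)
        # When the customer withdraws 30
        account.withdraw(30)
        # Then the balance is 70
        assert account.balance == 70

    if __name__ == "__main__":
        test_withdrawal_reduces_balance()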
Feedback
In the post-conference feedback everyone was overwhelmingly positive and found the open-spaces format very energising: fantastic sharing of real-world experiences, introductions to new approaches, nuggets of information, and great corridor conversations – a format that allows real human interaction.
Next ALT.NET beers on 14th Oct.
Next ALT.NET group therapy in Jan 2009, with a larger venue.
