Evaluating project management software is a delicate matter, as it potentially involves changing deeply rooted company habits; in general, this holds for all work management software. Being the subject of such scrutiny, we find that most companies fortunately take the right approach from the start: trying the software, entering some real project and people data into it, and seeing whether it works. In the case of Twproject, this can be done quickly: there is both an online demo and an easily installable version, with considerable support readily available online to all.
But unfortunately not all companies take this "hands on" approach: some take the path of putting together a list of requirements and, instead of testing the software against it, ask the software producers "whether their software meets these requirements".
Now imagine that this was the way you chose to buy a house: you went to the seller with a list of features, and you never went to actually see the house. You buy it because the seller tells you that the house meets your requirements. Nobody would do anything so foolish, no?
Putting together requirements and testing the software against them is actually a good idea, if handled properly: testing will at times show that the requirements are contradictory or need refinement, and that the software actually models problems better than imagined in the requirements. This is not surprising, as widely used software embodies a lot of management experience, often more than the users have. So the initial requirements should be only a first indication, not the main criterion for the final choice, which should be led by the usability and the coverage of real problems offered by the software at hand.
Unfortunately there are producers of PM software and consultants who encourage users along the foolish path: they give formal answers to formal requirements, do demos themselves instead of letting the customers do that, make phone calls to facilitate over-priced sales, and all the usual bad practices.
Well, we don’t do that. We concentrate all our energies on developing a better product and on producing publicly accessible documentation; we want customers to have software that really works and is usable, not just to close formal deals.
One can make software that satisfies long lists of requirements but is totally unusable: it will be hated by users and lead to user rejection, and hence to a huge waste of the company’s money and people’s time. We really hope never to take that path.
Silvia Chelazzi - Pietro Polsinelli
After some trying, I have been testing several products. As I’m an Apple user, I was pleased to have XP on it as well, since some programs were not Mac-savvy. I could even find out a lot without RTFM. Needed it though!
Imagine my surprise finding that TW worked on my Mac, with far fewer IT skills than I had feared were necessary. I liked the way I could use it for IT-related projects as well as unrelated ones, like a major trade fair in March. After my own consideration, consulting other potential users at my customer, and cracking some fine nuts, today the company that hired me has bought the first batch of licenses.
Apart from the product, it was the support that made my day and made me propose Teamwork. For after all, even interactive manuals cannot replace a skillful person.
I agree in part. One unfortunately cannot "test drive" all available options out there to start off with. I started with more than 150 possibilities. I’ve now brought this number down to 4 by using "elimination lists" which in part relied on feedback I got from websites, vendors, and even users. With 4 in hand, I’ve started testing on trial versions of the software.
I’ve learnt, yet again, how important it is to really listen to what it is the possible client is asking, instead of trying to tell them what they should allow you to tell them…
By the way, Teamwork will be tested next – based on what is available on your web site, the things you’ve said on your blog, the methodologies you subscribe to as an organisation, etc. Well done!
I think the whole matter of software selection is incredibly context dependent. Every tool has pros and cons, but these carry bigger benefits and drawbacks depending on the context.
Moreover, I started to consider usability as one of the key factors (I am tempted to say THE key factor, but that would not be context dependent…). Simply put: you can’t ask companies about the usability of their product. The only way to address usability is to just try the typical task and see how it fits the typical person doing that task in your context… (you can obviously do a lot more, but that’s step number one).
Fixed, thanks.
Hey, was checking out your site and noticed:
“Are you a Twitter addicted that traces all activities with twits? You can import all of those or only the relevant ones as timesheets in Teamwork. You can also send all your Teamwork worklogs to Twitter!”
I think you mean “Are you a twitter addict”.
I do dig the idea of the twitter integration though!