If you want to enable Remote Desktop with a blank password on Windows 8, the procedure is the same as enabling RDP on Windows 7; instructions can be found here.
As a colleague pointed out, enabling this is a security risk.
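For reference, the setting usually involved here (not spelled out in the linked instructions, so treat this as an assumption about what they describe) is the "Accounts: Limit local account use of blank passwords to console logon only" security policy, which maps to the LimitBlankPasswordUse registry value. Setting it to 0 allows network logons (including RDP) with a blank password:

```
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa" /v LimitBlankPasswordUse /t REG_DWORD /d 0 /f
```

Again, as noted above, loosening this policy is a security risk.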
Wednesday, December 19, 2012
Saturday, July 28, 2012
Migrating Legacy Applications
Legacy Systems
Legacy systems are systems that have grown over the years, have become harder and harder to change, and/or are built on outdated technologies. Since business changes rapidly, these systems can no longer keep up the pace. Especially in companies that sell services, the products sold can be very complex; take insurance, for instance. Often these systems are the core of the business: if they go down the company loses money, and in certain areas you can lose thousands of dollars for every minute you are down.
Symptoms
Signs that your current system is becoming unstable include:
- Downtime
- More and more bugs and regressions
- Implementing new features takes longer and longer
- Solving bugs takes longer
- Inability to scale up your system
- Needing to upstaff your operational team
Usually, by the time you start experiencing the negative effects of these symptoms, your legacy system will need to be changed quite urgently.
These symptoms result in an increased TCO. Keeping the system operational becomes increasingly difficult: the people who operate these systems grow older, and the skills required become very rare.
Approaches
There are several approaches you can take to modernize your legacy system:
- Garbage in, garbage out: for certain languages there are tools that convert your source code from one language to another, like VB6 to VB.NET or VB.NET to C#
- Big Bang Migration: a full migration of the system; you rewrite your whole application at once and on one magic day you put everything into production in one go.
- Incremental Migration: you partition your system into smaller subsystems and put it into production increment by increment.
Migration by tool
This approach can work if, for instance, you need to upgrade your system from one version to another. The biggest risk is that it does not solve the fact that your software is difficult to adapt: it is an illusion that a system will be easier to adapt in a new technology than it was in the old one. What new technology can add is better development tools, with better support for refactoring, that help you bring your system into a condition that is more suited to change.
Big Bang Migration
In this approach you rewrite the application from scratch and migrate your data from the old system to the new one. The biggest risk is that if the project fails, it fails entirely, and the outcome becomes more unpredictable the bigger the project gets. The risks of this undertaking are generally huge and need to be assessed carefully. The approach can be successful when the project is quite small, your data is of high quality, you know your data structures, and you don't have a lot of dependencies on other systems.
There is also no guarantee that the new system will not start to show the same symptoms as the old one. If you do not put good programming techniques in place and adhere to very strict quality guidelines, chances are the new system will become hard to change quite quickly.
Another downside is that the business will not stop evolving either: you will need to re-implement the old functionality and in the meantime keep track of new developments as well.
The cost is very high, since you need to keep operating your legacy system while developing the new one. That development requires input from the people operating the legacy system, whose availability will be low because they need to keep the business afloat.
This approach is often chosen, but it often fails or costs enormous amounts of money.
Incremental Migration
When performing an incremental migration, you divide your legacy system into smaller functional components and migrate them step by step. The partitioning should be done just-in-time: just enough to keep you busy. The huge upside of this approach is that the risks are highly controllable; if something goes wrong, the feedback is rapid and the impact of the failure is limited and predictable.
Another big advantage is that you will be forced to build your new system in a way that is easily extended and changed.
The downside is that the people who operate your legacy system need to be actively involved in the new developments, since they have indispensable knowledge about how the old system and the business operate; but because they are responsible for operating the old system, their availability might be quite limited. This comes on top of a significant increase in cost from having to keep two teams.
Keep in mind that with an incremental modernization every step you take needs to be able to be rolled back, so in the beginning this will require a significant effort to make deploying and rolling back your changes easy and fast.
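As a minimal illustration of the rollback requirement (the releases/timestamp layout and the current/previous symlinks are my assumptions, not from the post), a deploy step that always keeps the previous increment around can look like this:

```shell
# Minimal deploy-with-rollback sketch. Assumed layout: each increment goes
# into releases/<timestamp>/, and a 'current' symlink points at the live one.
set -e
mkdir -p build && echo "increment 1" > build/app.txt   # stand-in for real build output
NEW="releases/$(date +%Y%m%d%H%M%S)"
mkdir -p "$NEW"
cp -R build/. "$NEW"/
[ -L current ] && ln -sfn "$(readlink current)" previous   # remember the rollback target
ln -sfn "$NEW" current
# Rolling an increment back is then just: ln -sfn "$(readlink previous)" current
```

The point is that switching a symlink is atomic and instantly reversible, which is exactly the property every incremental step needs.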
Sunday, July 08, 2012
Git and Visual Studio 2012
Some points to keep in mind when using Git in combination with Visual Studio 2012, ReSharper and NuGet.
Manually add the following entries to your .gitignore file:
- packages/
- TestResults/
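If you prefer to script this, a small sketch (assuming a POSIX shell, run from the repository root) that appends the entries only when they are not already present:

```shell
# Append the two folders to .gitignore, skipping entries that already exist
for entry in "packages/" "TestResults/"; do
    grep -qxF "$entry" .gitignore 2>/dev/null || echo "$entry" >> .gitignore
done
```

The -x flag makes grep match whole lines only, so "packages/" is not mistaken for a partial match, and -F treats the entry literally rather than as a pattern.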
#################
## Visual Studio
#################

## Ignore Visual Studio temporary files, build results, and
## files generated by popular Visual Studio add-ons.

# User-specific files
*.suo
*.user
*.sln.docstates

# Build results
[Dd]ebug/
[Rr]elease/
*_i.c
*_p.c
*.ilk
*.meta
*.obj
*.pch
*.pdb
*.pgc
*.pgd
*.rsp
*.sbr
*.tlb
*.tli
*.tlh
*.tmp
*.vspscc
.builds
*.dotCover
bin/
packages/
TestResults/

# Visual Studio profiler
*.psess
*.vsp

# ReSharper is a .NET coding add-in
_ReSharper*

# Installshield output folder
[Ee]xpress

# DocProject is a documentation generator add-in
DocProject/buildhelp/
DocProject/Help/*.HxT
DocProject/Help/*.HxC
DocProject/Help/*.hhc
DocProject/Help/*.hhk
DocProject/Help/*.hhp
DocProject/Help/Html2
DocProject/Help/html

# Click-Once directory
publish

# Others
[Bb]in
[Oo]bj
sql
TestResults
*.Cache
ClientBin
stylecop.*
~$*
*.dbmdl
Generated_Code #added for RIA/Silverlight projects

# Backup & report files from converting an old project file to a newer
# Visual Studio version. Backup files are not needed, because we have git ;-)
_UpgradeReport_Files/
Backup*/
UpgradeLog*.XML

############
## Windows
############

# Windows image file caches
Thumbs.db

# Folder config file
Desktop.ini

#############
## Python
#############

*.py[co]

# Packages
*.egg
*.egg-info
dist
build
eggs
parts
bin
var
sdist
develop-eggs
.installed.cfg

# Installer logs
pip-log.txt

# Unit test / coverage reports
.coverage
.tox

# Translations
*.mo

# Mr Developer
.mr.developer.cfg

# Mac crap
.DS_Store
Saturday, July 07, 2012
SharpPOS: The Personae
Friday, July 06, 2012
New Project
A few weekends ago a friend of mine lost his POS program during renovations. I told him maybe I could help him out and find him a new (open source) alternative instead of having to buy a new one. I decided to give OpenBravoPOS a go; it is a neat system with tons of functionality. However, it has some quirks, the most important one being that the Close Cash report printed at the end of the day cannot show the total amount sold per tax category, so this has to be calculated afterwards, which is a bit of a pain...
Another thing I noticed is that entering products is a really, really painful experience: reference and barcode need to be unique, when in fact we need neither the reference nor the barcode. It is also a bit annoying that, when entering the price of a product, you need to set the tax rate before you can enter the selling price including tax, because the base of the calculation is the price excluding tax.
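The calculation behind that last annoyance is simple enough; a quick sketch (the 21% rate and the prices are assumed example figures, not from the program) of deriving the tax-exclusive base price from a tax-inclusive selling price:

```shell
# Derive the tax-exclusive base price from a tax-inclusive selling price.
# Example figures are assumptions: a 12.10 selling price at 21% VAT.
INCL=12.10
RATE=21
awk -v incl="$INCL" -v rate="$RATE" 'BEGIN {
    base = incl / (1 + rate / 100)          # price excluding tax
    printf "base=%.2f tax=%.2f\n", base, incl - base
}'
# prints: base=10.00 tax=2.10
```

Having to enter the 10.00 by hand when you only know the 12.10 on the price tag is exactly the friction described above.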
So I came up with the idea of writing him a brand new program called SharpPOS. The first thing I did was write up the personas, one for the owner and one for a waitress. Then I went on to identify the Minimal Marketable Product, which turned out quite OK. The goal of my little project is to show the real power of Agile.
Wednesday, May 02, 2012
Friday, February 17, 2012
The difference between a Mock and a Stub
Many people have a hard time understanding the difference between a Mock and a Stub. Martin Fowler states that a Mock is about behaviour verification and a Stub is about state verification.
From: http://martinfowler.com/articles/mocksArentStubs.html
Dummy objects are passed around but never actually used. Usually they are just used to fill parameter lists.
Fake objects actually have working implementations, but usually take some shortcut which makes them not suitable for production (an in memory database is a good example).
Stubs provide canned answers to calls made during the test, usually not responding at all to anything outside what's programmed in for the test. Stubs may also record information about calls, such as an email gateway stub that remembers the messages it 'sent', or maybe only how many messages it 'sent'.
Mocks are what we are talking about here: objects pre-programmed with expectations which form a specification of the calls they are expected to receive.
I am going to try to explain the difference to you with one simple example...
The classes involved:
FileProcessor: The actual business logic, that we are trying to test, the Subject Under Test
File: some file we are processing
FileValidator: Performs some logic on the file to see if it contains errors
FileMover: Moves the file to a specific location depending on the results returned by the FileValidator
public class FileProcessor{
    private readonly IFileMover _FileMover;
    private readonly IFileValidator _FileValidator;
    //Constructor
    public FileProcessor(IFileMover fileMover, IFileValidator fileValidator){
        _FileMover = fileMover;
        _FileValidator = fileValidator;
    }
    public void Process(File file){
        ValidationResults result = _FileValidator.Validate(file);
        if(result.IsValid){
            _FileMover.Success(file);
        }else{
            _FileMover.Error(file);
        }
    }
}
public interface IFileMover{
    void Success(File file);
    void Error(File file);
}
public class ValidationResults{
    public virtual bool IsValid{
        get;
    }
}
public interface IFileValidator{
    ValidationResults Validate(File file);
}
We need to create a dummy called DummyValidationResults
public class DummyValidationResults: ValidationResults{
    public override bool IsValid{
        get{ return false;}
    }
}
Now we create a Mock MockFileMover
public class MockFileMover: IFileMover{
    public void Success(File file){
        this.SuccessCalled++;
    }
    public void Error(File file){
        this.ErrorCalled++;
    }
    public int SuccessCalled{get; private set;}
    public int ErrorCalled{get; private set;}
}
We create a Stub named StubFileValidator
public class StubFileValidator: IFileValidator{
    public ValidationResults Validate(File file){
        return new DummyValidationResults();
    }
}
Now, when we want to test the FileProcessor, we can write:
File aFile = new FakeFile();
MockFileMover fileMover = new MockFileMover();
FileProcessor processor = new FileProcessor(fileMover, new StubFileValidator());
processor.Process(aFile);
Assert.AreEqual(1, fileMover.ErrorCalled);
Assert.AreEqual(0, fileMover.SuccessCalled);
So what is the difference?
The stub returns a predetermined answer, and the path through the code dictates that it will always be called. The calls to the mock, on the other hand, are enclosed in decision logic, in this case an if statement (it could just as well be a try/catch, a switch, or whatever); afterwards we verify whether the correct method on the mock was called... so the mock is responsible for failing or succeeding the test...