Sorry that I can't see your results, because I don't have any glasses to test this stuff with. (I've always preferred parallel stereo pairs m'self)
Remember that you need to take distance into account, so simply shifting the entire image by a fixed number of pixels would not be enough. For proper perspective, the red & blue offsets need to be a function of the "distance" from the viewer: "nearer" pixels in the scene get shifted more, "distant" pixels get shifted less.
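Just to illustrate the idea, here's a rough sketch of a depth-dependent shift (all the names and the linear depth-to-shift mapping are my own invention, not from any particular engine): map each pixel's depth to a horizontal offset, with the near plane getting the maximum shift and the far plane getting none, then move only the red channel by that amount.

```python
MAX_SHIFT = 4  # hypothetical maximum offset in pixels, for the nearest depth

def shift_for_depth(depth, near=0.0, far=10.0, max_shift=MAX_SHIFT):
    """Map a depth value to a pixel shift: near -> max_shift, far -> 0."""
    t = (depth - near) / (far - near)   # 0.0 at the near plane, 1.0 at the far
    t = min(max(t, 0.0), 1.0)           # clamp to the [near, far] range
    return round(max_shift * (1.0 - t))

def shift_red_plane(red_plane, depth_plane):
    """Shift each red pixel left by its depth-dependent offset.

    red_plane and depth_plane are row-major lists of rows. The returned
    plane would be recombined with the unshifted green/blue channels to
    form the anaglyph.
    """
    height, width = len(red_plane), len(red_plane[0])
    out = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            nx = x - shift_for_depth(depth_plane[y][x])  # shift toward the left
            if 0 <= nx < width:
                out[y][nx] = red_plane[y][x]
    return out
```

With this, a row of pixels all at the far plane comes back unshifted, while a row at the near plane slides left by the full MAX_SHIFT. A real version would also need to fill the gaps the shift leaves behind.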
For a program like @bplus's 3D cubes, you could divide the image into layers of cubes along the Z axis and shift each layer according to distance. This would give a "ViewMaster effect" where you have several flat objects hovering in front of each other, but it would be something.
The best solution would be to have a 3D engine to generate the left-eye image (with full perspective) from one point of view, then generate the right-eye image from a point of view slightly to the right.
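Once you have the two renders, combining them into an anaglyph is the easy part. A minimal sketch (assuming red/cyan glasses and images stored as rows of (R, G, B) tuples): take the red channel from the left-eye image and the green/blue channels from the right-eye image.

```python
def compose_anaglyph(left_rgb, right_rgb):
    """Build a red/cyan anaglyph from two same-sized RGB images.

    Red comes from the left-eye render, green and blue from the
    right-eye render. Images are row-major lists of (R, G, B) tuples.
    """
    return [
        [(l[0], r[1], r[2]) for l, r in zip(left_row, right_row)]
        for left_row, right_row in zip(left_rgb, right_rgb)
    ]
```

The hard work, rendering the scene twice with the camera nudged sideways between passes, stays in the 3D engine; this step is just channel mixing.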