Monthly Archives: September 2013

The art room at Pauline’s school

I was at Pauline’s school to help her out with her art project, and I took the opportunity to photograph some of the interesting things there with the macro lens.

First up is a series of shots of plaster of Paris. The air bubbles that formed as the plaster was poured created some very interesting structures. The problem was the poor lighting in the room, and I did not have my flash with me.

[Photos: P9280291-Edit, P9280110, P9280087]

Next up is the drying rack, which had interesting lines. The image was processed with filters to give it a more striking look.

[Photo: P9280055-Edit]

The metal gate had some interesting textures; this one is of the spherical portion of the gate.

[Photo: P9280125]

This last one is of the paint itself, which looks like muscle tissue but is actually just dried red paint.

[Photo: P9280123]

I have got to remember to bring some lights for indoor shooting, as rooms like these can be really dark for macro work.

HDR of the view from the house

I did an HDR using Nik HDR Efex Pro 2, which I recently bought as part of the Nik Collection. With Photoshop moving to the cloud on a subscription basis, I wanted to get away from expensive Photoshop, and Google was having a special on the entire Nik Collection. So I bought that, and HDR Efex Pro 2 is really quite good. Definitely a step up from Photoshop’s HDR.

I took a couple of long exposures of the view from the house and did an HDR with them. Not too bad.

[Photo: P9260918_HDR]

Parallelization in R

I have been working with pretty large data sets for statistical testing recently. Large as in millions of comparisons, which really take a lot of time. In the past, I used plyr with the doMC parallelization backend. That took a lot of time: the tests themselves seemed to finish fast enough, but the aggregation took nearly forever. Then I switched to foreach with doMC, which was better, but not by much. The initial plyr code took more than 2 hours; switching to foreach and optimizing brought that down to about 1 hour.
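For reference, here is a minimal sketch of the foreach plus doMC pattern. The toy data, the wilcox.test call, and the chunk count are placeholders, not my actual workload; handing each worker one big chunk instead of one comparison per task is the kind of optimization that helps, since the master then combines 10 vectors instead of millions of tiny results.

library(foreach)
library(doMC)

registerDoMC(cores = 10)              # register the 10 cores used here

set.seed(42)                          # toy data; the real data comes off disk
mat   <- matrix(rnorm(200 * 50), nrow = 200)
pairs <- replicate(1e5, sample(nrow(mat), 2), simplify = FALSE)

# One contiguous chunk of comparisons per core, so each %dopar% task
# returns a single vector rather than one tiny result per comparison.
chunks <- split(seq_along(pairs), cut(seq_along(pairs), 10, labels = FALSE))

pvals <- foreach(idx = chunks, .combine = c) %dopar% {
  vapply(idx, function(i) {
    p <- pairs[[i]]
    wilcox.test(mat[p[1], ], mat[p[2], ])$p.value
  }, numeric(1))
}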

Considering that I was using 10 cores, this was still slow: on my laptop, a different program could finish twice as many comparisons in a bit over 3 hours on a single core.

Finally, I split the jobs using Python and ran them as separate processes, and this brought it down to 15 minutes on 10 cores. That was at least acceptable performance.

R has some really great libraries, but getting scalable performance seems so difficult. One stupid use of a data structure can result in huge slowdowns. With big data becoming more commonplace, is there a better statistical alternative to R? I tried pqR and it was not too bad; definitely faster than standard R, but probably not fast enough. I have workloads comprising more than 3 billion comparisons that, even in Java with threading, require more than a week of compute. Using R for that would be insane, but coding in R is so much quicker because the libraries are all there, whereas in Java I have to write everything myself. It is a serious dilemma.

For now, I will stick to writing single-core R code and then parallelizing it with old-fashioned processes launched from Python. That has been the best-performing approach so far.
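Concretely, the pattern looks something like the script below: a plain single-core R script that takes a slice index on the command line and writes its results to its own file, while a Python (or even shell) driver simply launches ten of them at once. The file names and the test are illustrative, not my actual pipeline.

#!/usr/bin/env Rscript
# chunk_test.R: run one slice of the comparisons on a single core.
# A driver launches N of these as separate OS processes, e.g. from the shell:
#   for i in $(seq 1 10); do Rscript chunk_test.R $i 10 & done; wait

args   <- commandArgs(trailingOnly = TRUE)
chunk  <- as.integer(args[1])         # this process's slice, 1..nchunk
nchunk <- as.integer(args[2])         # total number of processes

set.seed(42)                          # toy data; the real data comes off disk
mat   <- matrix(rnorm(200 * 50), nrow = 200)
pairs <- replicate(1e5, sample(nrow(mat), 2), simplify = FALSE)

# Take every nchunk-th comparison, starting at this slice's offset.
mine <- seq(chunk, length(pairs), by = nchunk)

pvals <- vapply(mine, function(i) {
  p <- pairs[[i]]
  wilcox.test(mat[p[1], ], mat[p[2], ])$p.value
}, numeric(1))

# Each process writes its own file; merging them afterwards is trivial.
write.csv(data.frame(pair = mine, p.value = pvals),
          sprintf("pvals_%02d.csv", chunk), row.names = FALSE)

Because each slice is a separate OS process, there is no shared R state and no aggregation happening inside a master R session, which I suspect is exactly why this beats the in-process backends.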

Loop

Loop is an iPad app that lets you create simple animated GIFs, like the flip-book animations we used to draw on our textbook pages.

Esther created the one below of a bouncing ball.

[Animated GIF: bouncing_ball]

I created the one below of a badminton match.

[Animated GIF: badminton]

Isn’t it cool? The best part is that it is really simple, so kids should have no problem with it. But that simplicity is probably its only shortcoming once you grow out of it.