This could actually be an alternative green vehicle... if it works!
Here is the idea: when we jump, our leg muscles push us up and/or forward. The stronger they are, the higher or faster we can go.
When we come back down, we dump that energy into the ground. The result is wasted energy, plus an impact from the ground back into us (which is also unpleasant).
Now, if we use a pair of springs with the right stiffness (which will depend on each person's weight and height), we can achieve two things:
1. The energy of our body's weight pushing down will be absorbed by the spring, so our legs feel less of the impact (a pleasant feeling).
2. We can reuse the same energy for the next jump (or the next running stride), and so jump higher or run faster.
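As a back-of-the-envelope sketch of why the stiffness must be matched to the person (assuming an ideal linear spring with no losses):

```latex
% Energy stored in a spring compressed by x:
E_{\text{spring}} = \tfrac{1}{2} k x^2

% For a person of mass m landing from height h, the spring must
% absorb roughly m g h over a comfortable compression distance x:
m g h \approx \tfrac{1}{2} k x^2
\quad\Rightarrow\quad
k \approx \frac{2 m g h}{x^2}
```

So the right spring constant k scales with the person's weight (and, through the usable compression distance x, with their build), which is exactly why a one-size spring won't work.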
I am not talking about these:
The problem with these is that they are not a natural fit for how we jump or run, because they replace our feet entirely, appending the spring to the end of our legs.
What I am talking about is building some sort of "boots" that add a relatively weak spring in parallel with our legs. In other words, our legs and brain still do the main work of leading the jump or the run, but we effectively make our leg muscles stronger.
Anyone interested in working on a prototype? I wish I had time!
In & out of Code
Who says a weblog needs a Description?
Microsoft Parallel Extensions - Implementor!
Microsoft Parallel Extensions to the .NET Framework is a new set of tools and libraries being developed by Microsoft to simplify parallel programming in this age of parallelism.
Silicon has already reached its limits in shuffling electrons around to perform our "computations" faster. A higher-gigahertz CPU is no longer a solution for faster execution of programs. The solution, apparently, is more CPUs, more cores and more modules. It seems we are moving towards a "network topology" for processing; funny enough, just like the rest of nature (apologies to fans of the "God power" theory)!
A genuine solution to this problem would be programming languages, compilers and runtimes that take care of utilizing many cores to run our program abstractions (did anybody say functional programming is promising?).
Well, until that day, we have to stick to a dirty workaround: write programs the old way, but with parallelism in mind.
Alright, Microsoft is making that simple (for .NET developers at least), and this is where Microsoft Parallel Extensions comes into play.
Thanks to extension methods and LINQ, the API is pretty straightforward, and parallelizing loops and the like becomes much simpler. Just replace your "foreach" loop with a call to the "Parallel.ForEach" method and your loop runs in parallel!
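For example, a minimal sketch of the swap (note: in the early Parallel Extensions CTPs the Parallel class lived under System.Threading; from .NET 4.0 onwards it is in System.Threading.Tasks):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks; // home of Parallel in .NET 4.0+

class Program
{
    static void Main()
    {
        var numbers = new List<int> { 1, 2, 3, 4, 5, 6, 7, 8 };

        // Sequential version:
        foreach (var n in numbers)
            Process(n);

        // Parallel version: same body, but iterations may run
        // concurrently across cores, in no guaranteed order.
        Parallel.ForEach(numbers, n => Process(n));
    }

    static void Process(int n)
    {
        // Stand-in for some CPU-heavy work on each item.
        Console.WriteLine(n * n);
    }
}
```

The catch, of course, is that this is only safe when the loop body has no shared mutable state (or guards it properly), which is exactly why each loop needs analysis first.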
The sad part is that you don't know in advance which loops can be parallelized, so you have to think about each one, analyze it, try it in parallel and, if all goes well, keep the change.
How many loops do you have in your application? I don't know about you, but there is no way I'm going to check all my loops for a potential performance boost this way. It's simply not humanly possible.
So here is my idea (listen up, MS folks!): build a dynamic analyzer that "suggests" which loops are worth considering for a switch to the parallel world.
It's like a "solution in search of a problem": normally you figure out that you need to parallelize, and then evaluate technologies and tools to work it out. My suggestion is to dynamically discover long-running loops, and thereby find a use for the services in Microsoft Parallel Extensions!
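No such analyzer exists (as far as I know), but the core measurement is trivial. A crude sketch of the idea; the LoopTimer helper and the 100 ms threshold are my own invention, not part of any Microsoft library:

```csharp
using System;
using System.Diagnostics;

// Hypothetical helper: wrap a loop, record its wall-clock time,
// and report it as a parallelization candidate if it is slow.
static class LoopTimer
{
    const long ThresholdMs = 100; // arbitrary "worth a look" cutoff

    public static void Measure(string loopName, Action loopBody)
    {
        var watch = Stopwatch.StartNew();
        loopBody();
        watch.Stop();

        if (watch.ElapsedMilliseconds >= ThresholdMs)
            Console.WriteLine(
                "{0}: {1} ms - consider Parallel.ForEach",
                loopName, watch.ElapsedMilliseconds);
    }
}

// Usage: wrap an existing loop without changing its body.
// LoopTimer.Measure("Invoice processing", () =>
// {
//     foreach (var invoice in invoices)
//         Process(invoice);
// });
```

A real analyzer would hook this in automatically (via a profiler or IL rewriting) rather than asking you to wrap every loop by hand; the point is just that runtime data, not guesswork, should pick the candidates.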
It's like "help me help you", but it works in my pragmatic world of constant enhancement! You don't have to be desperate to improve your software. Do it for the sake of making this world a little more beautiful, and it will pay off!
In the beginning...
Here I am, blogging now!
Not sure what I'm going to write about, but it will be a mix of personal thoughts on programming, software development, speaking in code, life, and how it can all be improved.
I love building intelligence into computers: getting them to do what I would otherwise have to do, and what I have to do is what others have to do too. I don't think a human being is any more respectable than a machine.
It's all about intelligence. I respect my Geeks Coder more than a cowardly fellow developer, when it does a better job.
Computers are easier to train than humans, and they do their duty more responsibly. They are just a bit too touchy (!), which doesn't matter to me. I take that one as my challenge.
I will also perhaps blog about my firm, Geeks Ltd: the stuff we are doing and where we plan to go...
Comments and emails are always welcome. The Web is about communication, and we all need that, so please feel free to drop me a line if you feel like it.