Our COE system causes more problems than it solves

Letter from Chua Yao Kun

After two decades, the Certificate of Entitlement system has proven to be a blunt tool that falls far short of the ideal of efficient distribution of limited resources. Neither has it discouraged vehicle population growth.
After two decades, we can establish the relationship of COE prices to buyer behaviour.
When COE prices are high, the cost of replacement is significantly higher than the cost of maintenance and one would keep the old vehicle on the road longer. This contributes to more pollution.
When prices are low, relatively young vehicles are scrapped to recover the value of COEs previously acquired at high cost. This is an unnecessary cash outflow for Singapore.
Even if a vehicle is not scrapped, the owner is likely to take advantage of the low prevailing quota premium ahead of COE price increases.
From this observation, one can see that the COE system has more effect on the average age of the vehicle population than on controlling the car population.
Instead of positive social and economic benefits, COEs have contributed to two negative social phenomena.
Firstly, vehicle ownership is increasingly concentrated in the hands of the rich rather than being allocated to activities that promote economic value.
Having different COE categories does not isolate different user groups.
As prices in Category B and the Open Category go higher, some prospective buyers without enough financial power are competing in Category A instead, going by the increased number of Mercedes, Volvo and other luxury makes in this category.
As Category A prices rise, some prospective buyers are being squeezed out and turning to commercial vehicles for their personal transport, going by the sofas I have seen in the back of those small vans.
In a nutshell, the different categories are cosmetic. That is why the COE prices rise and fall in lockstep.
Businesses that need transport are the ones hardest hit. Regardless of the economic and competitive environment, they are passive price takers of the high COE price, while the rich throw their excess cash around.
There is no economic redistribution of resources.
Secondly, COEs drive inflation. Taxi companies, logistics companies and any company involved in moving people and goods would be forced to transfer their escalated costs to consumers, many of whom are not car owners.
A quota on the vehicle population does not need a highest-bidder system to administer it.
While the COE system is an attractive avenue to generate cash, society pays a hefty price.
It is a broken system that has no place in Singapore today, especially when we are trying to promote an inclusive society and arrest inflation.

———————————————————————-
Everything in Singapore seems to be going downhill because nothing works anymore. Does our COE system allocate resources efficiently? It seems to be only a tool to fill the government's coffers at all costs. Who would imagine that our taxis, which are supposed to serve our needs, have had their costs inflated so high by COE payments and a diesel tax that no longer reflects the market? Not only is our MRT system in a mess; what is our Transport Minister doing, embracing CNG with tax breaks and then raising its price to almost that of petrol? While the rest of the world is readying battery-operated cars to be environmentally friendly and offset the need for petrol, Singapore is falling behind in everything. Even our transport for business has been inflated so high, along with property prices, that we are no longer competitive. Who is responsible for removing airbags from taxis? They removed the airbags to save costs, and both occupants of the taxi died when the Rochor accident happened; the airbags could at least have saved a life. What a pity; someone is going to take the fall for this.
We demand that you:
1) Scrap the diesel tax because it is no longer reflective of market conditions, lower CNG to market prices, and do a feasibility test of whether Singapore is a viable place to embrace battery-powered cars.
2) Increase the supply of COEs for commercial vehicles.
3) Revoke the requirement for new taxis to pay COE, and refund the COE paid on existing taxis. This will increase taxi drivers' incomes and create demand for at least 10,000 new taxis, indirectly paying for the lost revenue; since most taxis are state-owned, it is money moving from the right hand to the left. What will be the impact? Taxi drivers' incomes will double, indirectly creating demand for at least 10,000 new jobs.
Who is trying to bluff us that an inflation rate of 5 per cent does not affect us? When will our Ministers wake up? After the Hougang SMC by-election? Time is short: in less than four years, 2016 will be reckoning day. Singapore must move away from bad policies and concentrate on growth and on investment in innovation and technology, to promote jobs and a minimum wage. I could easily rewrite the entire economy, but the PAP government is too stupid to listen.
– Contributed by Oogle.


World Bank Data is now online, anyone keen to find our missing billions?

Singapore’s Data from the World Bank

http://databank.worldbank.org/ddp/home.do?Step=2&id=4&hActiveDimensionId=WDI_Series


Title: The 15 Countries That Are Buried Under The Most Debt (Apr. 4, 2011)

http://www.businessinsider.com/debt-…#ixzz1vJfuboev
 
List of countries by public debt

http://en.wikipedia.org/wiki/List_of…by_public_debt

Debt-to-GDP ratio: explanation

http://en.wikipedia.org/wiki/Debt-to-GDP_ratio

Cash surplus/deficit (% of GDP)

http://data.worldbank.org/indicator/…-last&sort=asc
 
Central government debt, total (% of GDP)

http://data.worldbank.org/indicator/…-last&sort=asc

There is so much data in the World Bank's database that I can easily create any view I want, to uncover every single secret and every single policy, anything I want to know, and not only for Singapore but for the entire world. Who will be the first to find our missing billions?
– Contributed by Oogle. 

Theories of Disease: Drugs will affect DNA Mutations

AsiaOne
Monday, May 28, 2012 

SINGAPORE – Scientists from the Genome Institute of Singapore (GIS) have unraveled the mystery behind the cause of liver cancer in a research partnership with various research institutes across the world.
A report, published in the Nature Genetics journal on May 2, explained that high exposure to the hepatitis B virus (HBV) could cause HBV to integrate into an individual’s genes.
The risk of developing liver cancer for an individual with high exposure to HBV is 100 times that of an individual without exposure to HBV.
This latest study managed to bypass limitations experienced in the past by leveraging technology that allowed for a survey of HBV integrations in tissues from 88 Chinese liver cancer sufferers.
The discovery means that medical researchers now have better understanding of HBV integration and liver cancer, and are better equipped to develop improved therapies for the disease.
Liver cancer is one of the most common solid tumours worldwide.
ljessica@sph.com.sg


Rare DNA mutations are so plentiful in the human genome that they make it difficult to precisely identify the genetic switches that cause many common human diseases, two studies found.
The data, released yesterday in the journal Science, shows that the vast majority of genetic variations found in people are rare and evolutionarily recent. Well-known DNA variations that are common across large populations probably don’t widely affect many illnesses, the authors said.
The research means it may be more difficult to isolate the roots of ailments such as diabetes and heart disease, and cures will be more elusive, said Joshua Akey, associate professor of genome sciences at the University of Washington in Seattle and an author of the study.
“The task of correlating individual variants with particular diseases is probably more complicated than we would have anticipated a few years ago,” Akey said in an interview. “It’s exciting because we’re starting to see patterns of variation we were never able to access before because the technology wasn’t there, and it’s frustrating because we don’t know what it means.”
In a study of 2,440 people from Europe and Africa, the researchers discovered about a half-million mutations, most of which were rare, novel or population-specific. The second effort, led by Matthew Nelson and Vincent Mooser of London-based drugmaker GlaxoSmithKline Plc (GSK), targeted DNA that was already considered to have potential for medical development.

Personalized Medicine

The findings highlight issues involved in the trend toward personalized medicine, in which drugmakers seek to determine whether a patient is genetically susceptible to a particular disease or would be especially responsive to certain treatments. More than 72 such therapies are available now, a fivefold increase from the 13 available in 2006, according to the Personalized Medicine Coalition, an industry advocacy group based in Washington.
“It could have significant implications for our health and medicine,” Eric Topol, director of the Scripps Translational Science Institute in La Jolla, California, said in an interview. “That whole common-disease, common-variant theory continues to have holes punched in it.”
Last month a different study concluded sequencing the genomes of patients to reveal what ailments might mar their futures wasn’t the best predictor for the most common diseases.

Twins Study

That study didn’t sequence individuals. Instead, researchers collected data from thousands of identical twins in five countries and used a computer model to determine the effectiveness of genome sequencing. They concluded most people would get negative results from having their genome sequenced for all except one of 24 identified conditions, including heart disease, diabetes and Alzheimer’s.
The key for future disease research is obtaining large amounts of genomic sequencing data from many individuals, Jacob Tennessen, a researcher at Oregon State University in Corvallis and lead author of the study, said in an interview. That, in itself, leads to other concerns, he said.
“The hard part is not getting the data, the hard part is analyzing the data,” Tennessen said. “We’re going to have so much information that figuring out the computational resources we need to create to analyze the data, and the statistical tests we need to analyze the data, that’s going to be the hard part.”
To contact the reporters on this story: Ryan Flinn in San Francisco at rflinn@bloomberg.net; Reg Gale in New York at rgale5@bloomberg.net
To contact the editor responsible for this story: Reg Gale at rgale5@bloomberg.net

Open Source Web Crawlers in Java


Heritrix
Heritrix is the Internet Archive’s open-source, extensible, web-scale, archival-quality web crawler project.
WebSPHINX
WebSPHINX ( Website-Specific Processors for HTML INformation eXtraction) is a Java class library and interactive development environment for Web crawlers that browse and process Web pages automatically.
JSpider
A highly configurable and customizable Web Spider engine, developed under the LGPL Open Source license, in 100% pure Java.
Web-Harvest
Web-Harvest is an open-source Web data extraction tool written in Java. It offers a way to collect desired Web pages and extract useful data from them. To do so, it leverages well-established techniques and technologies for text/XML manipulation such as XSLT, XQuery and regular expressions. Web-Harvest mainly focuses on HTML/XML-based web sites, which still make up the vast majority of Web content. It can also be easily supplemented by custom Java libraries to augment its extraction capabilities.
WebEater
A 100% pure Java program for web site retrieval and offline viewing.
Bixo
Bixo is an open source web mining toolkit that runs as a series of Cascading pipes on top of Hadoop. By building a customized Cascading pipe assembly, you can quickly create specialized web mining applications that are optimized for a particular use case.
Java Web Crawler
Java Web Crawler is a simple Web crawling utility written in Java. It supports the robots exclusion standard.
WebLech
WebLech is a fully featured web site download/mirror tool in Java, which supports many features required to download websites and emulate standard web-browser behaviour as much as possible. WebLech is multithreaded and will feature a GUI console.
Arachnid
Arachnid is a Java-based web spider framework. It includes a simple HTML parser object that parses an input stream containing HTML content. Simple Web spiders can be created by sub-classing Arachnid and adding a few lines of code called after each page of a Web site is parsed.
JoBo
JoBo is a simple program to download complete websites to your local computer. Internally it is basically a web spider. Its main advantage over other download tools is that it can automatically fill out forms (e.g. for automated login) and use cookies for session handling. Compared to other products the GUI seems very simple, but it is the internal features that matter. Do you know any other download tool that can log in to a web server and download content when that server uses web forms for login and cookies for session handling? It also features very flexible rules to limit downloads by URL, size and/or MIME type.
Crawler4j
Crawler4j is a Java library which provides a simple interface for crawling the web. Using it, you can set up a multi-threaded web crawler in five minutes. It is also very efficient: it has been able to download and parse 200 pages per second on a quad-core PC with a cable connection.
Ex-Crawler
Ex-Crawler is divided into three subprojects. The Ex-Crawler server daemon is a highly configurable, flexible (Web) crawler, including distributed grid/volunteer computing features, written in Java. Crawled information is stored in a MySQL, MSSQL or PostgreSQL database. It supports plugins through multiple plugin interfaces. It comes with its own socket server, through which you can configure it, add URLs and much more, including user accounts and user levels, which are shared with the web front-end search engine. With the Ex-Crawler distributed crawling graphical client, other people and computers can crawl and analyse websites, images and more for the crawler. The third part of the project is the web front-end search engine.
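Several of the crawlers listed above respect the robots exclusion standard mentioned for Java Web Crawler. As a minimal sketch of that idea only, and not of any of these libraries' actual APIs, the hypothetical RobotsRules class below parses the Disallow lines of a robots.txt body and checks whether a given path may be fetched:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the robots exclusion standard: parse the Disallow
// rules from a robots.txt body and check whether a path may be fetched.
// Hypothetical class for illustration; real crawlers also handle
// per-User-agent groups, Allow rules and wildcard patterns.
public class RobotsRules {
    private final List<String> disallowed = new ArrayList<>();

    // Collect every non-empty "Disallow:" path prefix.
    public RobotsRules(String robotsTxt) {
        for (String line : robotsTxt.split("\n")) {
            String trimmed = line.trim();
            if (trimmed.toLowerCase().startsWith("disallow:")) {
                String path = trimmed.substring("disallow:".length()).trim();
                if (!path.isEmpty()) {
                    disallowed.add(path);
                }
            }
        }
    }

    // A path is allowed unless it starts with a disallowed prefix.
    public boolean isAllowed(String path) {
        for (String prefix : disallowed) {
            if (path.startsWith(prefix)) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        RobotsRules rules = new RobotsRules(
            "User-agent: *\nDisallow: /private/\nDisallow: /tmp/");
        System.out.println(rules.isAllowed("/index.html"));   // allowed
        System.out.println(rules.isAllowed("/private/data")); // blocked
    }
}
```

A polite crawler would fetch /robots.txt once per host, build such a rule set, and consult it before every page request.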

Graphene Barristor: the 300 GHz chip is possible, but what about the other components of a computer? You do not have a complete solution

Samsung’s foremost R&D division, the Samsung Advanced Institute of Technology, has reportedly published detailed research on the Barristor in the journal Science on May 17. This is the wonder chip that researchers were looking for.
We’ve talked about the amazing properties of Graphene before here. And we also reported on silicene’s arrival to the market before graphene in this article.
The most important fact about graphene is that this material allows electrons to move through it 200 times more easily than through silicon.
Practically, if you build a CPU using graphene instead of silicon, it will likely run at 300 GHz instead of 3 GHz: even if the technology achieves only half of graphene's 200-fold electron mobility advantage, that still leaves a roughly 100-fold speed-up.
IBM even thought about theoretical 1000 GHz chips using graphene.
Graphene has been a difficult material to work with because it is a semi-metal, so the electron flow cannot easily be stopped.
Using it in cooling solutions is indeed a good idea as no current is passing through it, but the higher electron mobility might help dissipate the heat energy faster in a cooling base.
There was talk about a combination between silicon and graphene, but it seems that such a combination is not desirable.
Trying to transform graphene into a semiconductor radically decreases the much-wanted electron mobility. That was the lesson of the silicene research.
Getting back to our wonderful Samsung graphene chip, we can tell you that the Korean scientists developed the graphene-silicon Schottky barrier.
This will allow or stop the electron flow by controlling the height of the barrier.
Samsung also built a basic logic gate device and demonstrated that the device can successfully fulfill one function. The basic operation the device can manage is adding.
The transistor-like gate was called a Barristor.
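The "adding" function of such a basic logic gate device can be illustrated with a half adder, the simplest adding circuit, built from an XOR gate (sum bit) and an AND gate (carry bit). The Java sketch below illustrates only the Boolean logic, not Samsung's graphene device itself:

```java
// Illustrative half adder: the simplest "adding" logic-gate circuit.
// sum = a XOR b, carry = a AND b. This models only the Boolean logic,
// not the physics of the graphene Barristor.
public class HalfAdder {
    // Returns {sum, carry} for two input bits (each 0 or 1).
    public static int[] add(int a, int b) {
        int sum = a ^ b;   // XOR gate: 1 when exactly one input is 1
        int carry = a & b; // AND gate: 1 only when both inputs are 1
        return new int[] { sum, carry };
    }

    public static void main(String[] args) {
        // 1 + 1 = binary 10: sum bit 0, carry bit 1
        int[] r = HalfAdder.add(1, 1);
        System.out.println("sum=" + r[0] + " carry=" + r[1]);
    }
}
```

Chaining half adders (via full adders) gives multi-bit addition, which is why demonstrating a single adding gate is a meaningful first step for any new transistor-like device.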
Samsung’s research institute owns nine other major graphene-related patents and is currently the only company to have built a working graphene computing micro device.

Everything on a chip: CPU, RAM and OS for mobile devices

ScienceDaily (May 18, 2012) — The first purely silicon oxide-based ‘Resistive RAM’ memory chip that can operate in ambient conditions — opening up the possibility of new super-fast memory — has been developed by researchers at UCL.

Resistive RAM (or ‘ReRAM’) memory chips are based on materials, most often oxides of metals, whose electrical resistance changes when a voltage is applied — and they “remember” this change even when the power is turned off.
ReRAM chips promise significantly greater memory storage than current technology, such as the Flash memory used on USB sticks, and require much less energy and space.
The UCL team have developed a novel structure composed of silicon oxide, described in a recent paper in the Journal of Applied Physics, which performs the switch in resistance much more efficiently than has been previously achieved. In their material, the arrangement of the silicon atoms changes to form filaments of silicon within the solid silicon oxide, which are less resistive. The presence or absence of these filaments represents a ‘switch’ from one state to another.
Unlike other silicon oxide chips currently in development, the UCL chip does not require a vacuum to work, and is therefore potentially cheaper and more durable. The design also raises the possibility of transparent memory chips for use in touch screens and mobile devices.
The team have been backed by UCLB, UCL’s technology transfer company, and have recently filed a patent on their device. Discussions are ongoing with a number of leading semiconductor companies.
Dr Tony Kenyon, UCL Electronic and Electrical Engineering, said: “Our ReRAM memory chips need just a thousandth of the energy and are around a hundred times faster than standard Flash memory chips. The fact that the device can operate in ambient conditions and has a continuously variable resistance opens up a huge range of potential applications.
“We are also working on making a quartz device with a view to developing transparent electronics.”
For added flexibility, the UCL devices can also be designed to have a continuously variable resistance that depends on the last voltage that was applied. This is an important property that allows the device to mimic how neurons in the brain function. Devices that operate in this way are sometimes known as ‘memristors’.
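The memristor behaviour described here, a resistance set by the last voltage applied and retained when the power is removed, can be sketched as a toy state model. The class below is an illustrative assumption (the resistance range, switching voltage and linear mapping are all invented for the example), not the UCL device physics:

```java
// Toy model of a memristor-like element: its resistance is set by the
// last programming voltage applied and persists with no power attached.
// Purely illustrative; the real device (silicon filaments forming in
// silicon oxide) is far richer than this linear mapping.
public class ToyMemristor {
    private double resistanceOhms;

    public ToyMemristor(double initialResistanceOhms) {
        this.resistanceOhms = initialResistanceOhms;
    }

    // A programming voltage moves the resistance between an assumed
    // low (ON) and high (OFF) state, linearly with voltage magnitude.
    public void applyVoltage(double volts) {
        double low = 100.0;      // assumed ON resistance, ohms
        double high = 100000.0;  // assumed OFF resistance, ohms
        double vMax = 5.0;       // assumed full-switching voltage
        double fraction = Math.min(Math.abs(volts) / vMax, 1.0);
        // Positive voltage drives toward low resistance, negative toward high.
        resistanceOhms = volts >= 0
            ? high - fraction * (high - low)
            : low + fraction * (high - low);
    }

    // Reading does not disturb the state: non-volatile behaviour.
    public double read() {
        return resistanceOhms;
    }
}
```

The continuously variable resistance between the two extremes is what lets such a device mimic the graded strength of a synapse rather than a simple on/off bit.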
This technology is currently of enormous interest, with the first practical memristor, based on titanium dioxide, demonstrated as recently as 2008. The development of a silicon oxide memristor is a huge step forward because of the potential for its incorporation into silicon chips.
The team’s new ReRAM technology was discovered by accident whilst engineers at UCL were working on using the silicon oxide material to produce silicon-based LEDs. During the course of the project, researchers noticed that their devices appeared to be unstable.
UCL PhD student, Adnan Mehonic, was asked to look specifically at the material’s electrical properties. He discovered that the material wasn’t unstable at all, but flipped between various conducting and non-conducting states very predictably.
Adnan Mehonic, also from the UCL Department of Electronic and Electrical Engineering, said: “My work revealed that a material we had been looking at for some time could in fact be made into a memristor.
“The potential for this material is huge. During proof of concept development we have shown we can programme the chips using the cycle between two or more states of conductivity. We’re very excited that our devices may be an important step towards new silicon memory chips.”
The technology has promising applications beyond memory storage. The team are also exploring using the resistance properties of their material not just for use in memory but also as a computer processor.
The work was funded by the Engineering and Physical Sciences Research Council.