Most states now require that you carry car insurance on any vehicle you drive. This protects you as well as the other party if an accident were to occur. In my opinion, without it, you're risking everything you own.
laurawilliamsmusings.blogspot.com/2007/10/what-would-we-do-without-insurance.html