A few months ago I was in a car accident. My hip was f*cked up (but not broken) and I couldn't work for a little while. The guy who was driving (a friend) was shaken but OK (the other car hit on my side), though his neck has been really stiff.
Anyhoo, the insurance company is supposed to pay out (since it wasn't our fault), but now they're bullsh*ttin' with us. What's the sense of paying for insurance when the one time we actually need it, we get nothing? I think the insurance industry is nothing but a big con, anyway. You pay and get nothing in return.
I returned to working full-time a while ago, but my hip still f*cks with me, especially when the weather is acting up. My friend still hasn't had his car repaired and is still only working part-time. His neck is still giving him some discomfort.
Hopefully, things will work out, but it doesn't look good...
I HATE INSURANCE COMPANIES!!