Car dealerships in the United States
In the United States, a car dealership is a business that sells cars. It can be either a franchised dealership, which sells both new and used cars, or a used car dealership, which sells only used cars. In most cases, dealerships also provide maintenance and repair services, as well as trade-in, leasing, and financing options for customers.