As a wound heals, new tissue grows over it, protecting the site and replacing the damaged skin. As this fibrous tissue settles in, a scar forms. Essentially, scars are nature’s way of ...