Women can gain a lot of influence in Hollywood. When I think of Hollywood, I automatically think of actors and actresses; I also think of working out and staying in great shape. In my opinion, Hollywood offers many opportunities to become famous and have the chance to fulfill your dreams.
When I think about Hollywood, a lot of mixed emotions come to mind. There are the ups of being famous and living the "perfect" lifestyle that you can only dream about or see on television. Surrounding yourself with people who strive to be better physically and emotionally will eventually influence you to be better as well. If you surround yourself with people who don't care and don't strive to improve, then you will eventually do the same. Women can gain more self-respect living in Hollywood by always striving to do better, keeping their bodies in great shape, and avoiding unhealthy food. On the other hand, some people think that living in Hollywood means you have to be that "perfect person." They believe they have to be stick-figure thin like models to fit into that atmosphere, which is completely false.
People relate Hollywood only to superstars, movie productions, and becoming famous. They forget that Hollywood is also a city, meaning there are "normal" people living their everyday lives there. The views are spectacular and the beaches are gorgeous. I am not a fan of some of the shows produced in Hollywood, for example "Teen Mom." Young women will audition for the role of a teen mom, and I am against getting pregnant just to be on a show. Another downfall of being in Hollywood is that the press is everywhere, because many stars live there and raise their families there. As I have stated before, though, Hollywood can influence someone to become better physically and mentally; people may eat healthier and become more productive in their everyday activities.