I was wondering whether I'll have to major in journalism in college to become a journalist. Say I want to write for a newspaper, magazine, or website: do I necessarily need a journalism degree? Could I just major in American Studies or American history and minor in journalism, or do I have to major in journalism?