As others have said, it entirely depends on what software you use and what you want to do. For straight photographic viewing and editing - no, you don't need a fancy graphics card. The built-in graphics (typically from Intel) that come as standard on most motherboards will do just fine. For photography, I would say your priorities should be 1) lots of memory and 2) a multi-core processor.

Photo editors use lots of memory! Photos are held internally as uncompressed bitmaps, and with a modern DSLR producing something like 6000x4000 pixels at 16 bits per channel, that takes an awful lot of RAM - and many operations make one or more working copies in memory whilst they run. Quite a lot of operations - focus stacking and panorama stitching would be good examples - also make very good use of parallel processing across multiple cores, and their completion time scales almost linearly with the core count. Even if you don't do that sort of thing, editing your photos whilst listening to music, or not seeing your current operation stutter when an email arrives, will benefit from a multi-core CPU.
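To get a feel for the numbers, here is a rough back-of-envelope sketch. The 6000x4000 sensor size and 3-channel RGB layout are illustrative assumptions, not any specific camera - adjust them for your own gear:

```python
# Rough estimate of the RAM one uncompressed photo takes in an editor.
# Assumes a 24-megapixel sensor (6000x4000) decoded to 3 RGB channels
# at 16 bits (2 bytes) per channel -- both figures are assumptions.
width, height = 6000, 4000
channels = 3
bytes_per_channel = 2  # 16-bit editing depth

one_copy = width * height * channels * bytes_per_channel
print(f"One copy:    {one_copy / 2**20:.0f} MiB")  # ~137 MiB

# Editors often hold several working copies at once (undo history,
# layers, preview caches), so multiply accordingly.
print(f"Five copies: {5 * one_copy / 2**20:.0f} MiB")  # ~687 MiB
```

Half a dozen images open at once, each with a few working copies, and you are into gigabytes before the application itself uses anything - which is why RAM comes first on the shopping list.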
Gaming is a different issue entirely and uses the graphics card in a completely different way - although, again, modern games generally benefit from lots of memory and multi-core CPUs. If you go for a GTX 1050 you will find that it is hardly used by photo viewing and editing applications (watch the core temperatures and fan speeds, for example, to see how heavily it is loaded).
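If you do fit an NVIDIA card and want to check how hard it is actually working during an editing session, the driver ships a command-line tool you can leave polling in a terminal (NVIDIA cards only; on Windows the Task Manager's GPU tab shows much the same):

```shell
# Print GPU utilisation, temperature and fan speed once a second.
# Requires the NVIDIA driver's nvidia-smi tool to be on the PATH.
nvidia-smi --query-gpu=utilization.gpu,temperature.gpu,fan.speed --format=csv -l 1
```

During ordinary photo editing you will typically see utilisation sitting near zero, which bears out the point above.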