Tackling this problem involves multiple tasks. You might want to divide the responsibilities among different tools at different stages, say using Python or R to do some data prep and enrichment work up front, followed by Tableau doing some interactive visualization further along in the process.
In addition to the tools you mention, I recommend spending some time getting familiar with spatial databases such as the open source PostGIS. Oracle and Microsoft also have spatial extensions to their databases. With a spatial database you can store spatial data such as points, polygons and paths along with relational data, and they have spatial functions built-in as well.
The main tasks to consider:
Geocoding - convert addresses to spatial coordinates. You're going to want to use a geocoding web service for this, probably calling it from Python or R. This is something you'd want to do once up front when preparing new data for analysis and then store the result for reuse by later stages.
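A minimal sketch of that geocode-once-and-store pattern. The `geocode` function here is a hypothetical stub with made-up coordinates standing in for a real web service call (for example Nominatim via the geopy library, or the US Census geocoder); the caching logic is the part that matters.

```python
# Hypothetical stub for a real geocoding web service call.
def geocode(address):
    # A real implementation would issue an HTTP request here.
    fake_results = {
        "100 Main St, Springfield": (39.80, -89.64),
        "200 Oak Ave, Springfield": (39.81, -89.65),
    }
    return fake_results.get(address)

def geocode_new_addresses(addresses, cache):
    """Geocode each address at most once; `cache` maps address -> (lat, lon).

    Persist the cache between runs (e.g. with json.dump, or a database
    table) so later stages can reuse the coordinates without re-calling
    the service.
    """
    for addr in addresses:
        if addr not in cache:  # only hit the service for unseen addresses
            cache[addr] = geocode(addr)
    return cache

cache = {}
geocode_new_addresses(["100 Main St, Springfield"], cache)
print(cache["100 Main St, Springfield"])  # (39.8, -89.64)
```

Most geocoding services rate-limit or charge per request, which is another reason to cache aggressively.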
Computing Voronoi polygons. This algorithm tiles the plane with polygons so that all the points within a polygon are closest to a particular point. So if you want to know which areas are closest to each of 5 hospitals, a Voronoi diagram will divide the map into non-overlapping regions, with one polygon for each hospital containing the areas that are closer to that hospital than any other. So if your hospitals don't move, you should compute the Voronoi polygons once and store the result for reuse. There are libraries in Python and R that can do this. PostGIS has a function for computing Voronoi polygons also.
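In Python, one such library is SciPy. A sketch, assuming SciPy is installed and using five made-up hospital coordinates:

```python
import numpy as np
from scipy.spatial import Voronoi

# Five hypothetical hospital locations (x, y in projected coordinates).
hospitals = np.array([
    [0.0, 0.0],
    [4.0, 0.0],
    [0.0, 4.0],
    [4.0, 4.0],
    [2.0, 2.0],
])

vor = Voronoi(hospitals)

# vor.point_region maps each input point to the index of its Voronoi
# region in vor.regions -- one region per hospital.
print(len(vor.point_region))  # 5
```

Note that the outer regions of a Voronoi diagram are unbounded, so for mapping you typically clip them against a bounding box or the boundary of your study area before storing them.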
Hit testing. This step tests whether a point falls within a polygon. You can perform hit testing by calling dedicated R or Python functions, but I recommend using a spatial database like PostGIS instead. There you can perform hit testing by calling a function such as ST_Contains or ST_Intersects in a spatial join. You can optimize performance by building a spatial index on your polygons after loading them into the database.
Presentation, Summarization. This is where Tableau is helpful. You can display spatial data like points and polygons, directly from a spatial database if you are using one, and also easily compute summary statistics like the percentages you mentioned.
The more recent versions of Tableau have even more support for spatial data and can compute distances and buffers although it may still be faster to push that work into the spatial database.
This point is probably obvious to many people, but just to be clear ... The approach described above performs well when the points you are measuring against, say the hospitals, have static locations, so it is worth the time to precompute the Voronoi polygons and to create any spatial indices - and also when you have many points that you wish to test. In those cases, the precomputed polygons and spatial indices can pay off with performance gains.
For smaller data sets, you can of course just compute the distance between every possible source and every possible destination and then choose the connection with the smallest distance. That simple but brute-force approach becomes less feasible as the number of data points grows, at which point it's worth the effort to go through the steps above.
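The brute-force version is only a few lines. A sketch with made-up coordinates (using plain Euclidean distance, which assumes projected coordinates; for raw latitude/longitude you'd want a haversine or geodesic distance instead):

```python
from math import hypot

def nearest_hospital(point, hospitals):
    """Brute force: measure the distance from `point` to every hospital
    and return the name of the closest one."""
    return min(
        hospitals,
        key=lambda name: hypot(point[0] - hospitals[name][0],
                               point[1] - hospitals[name][1]),
    )

# Hypothetical projected coordinates.
hospitals = {"H1": (0.0, 0.0), "H2": (4.0, 4.0)}
addresses = {"A": (1.0, 1.0), "B": (3.5, 3.0)}

for addr, pt in addresses.items():
    print(addr, "->", nearest_hospital(pt, hospitals))
# A -> H1
# B -> H2
```

This does len(addresses) * len(hospitals) distance computations, which is exactly the quadratic growth that makes the precomputed-polygon approach attractive for large data sets.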