Foursquare API
Jupyter Notebook on a local machine / IBM Watson account
Data
The Wikipedia page (https://en.wikipedia.org/wiki/List_of_United_States_cities_by_population) was scraped using the BeautifulSoup library to build a pandas dataframe listing the cities, states, coordinates, area and population density. The dataframe was cleaned and processed appropriately.
The Wikipedia page (https://en.wikipedia.org/wiki/List_of_United_States_counties_by_per_capita_income) was scraped using the BeautifulSoup library to build a pandas dataframe listing the cities, states and per capita income. The dataframe was cleaned and processed appropriately. A sketch of the scraping step is shown below.
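A minimal sketch of how both pages could be scraped, assuming the relevant table is the first "wikitable" on each page; the exact column layout of the live pages may differ, so the parsing and subsequent cleaning would need to be adapted:

```python
import requests
import pandas as pd
from bs4 import BeautifulSoup

def scrape_wikitable(url):
    """Fetch a Wikipedia page and return its first 'wikitable' as a DataFrame."""
    html = requests.get(url).text
    soup = BeautifulSoup(html, "html.parser")
    table = soup.find("table", {"class": "wikitable"})
    rows = []
    for tr in table.find_all("tr"):
        cells = [cell.get_text(strip=True) for cell in tr.find_all(["th", "td"])]
        if cells:
            rows.append(cells)
    # First row is assumed to be the header; real pages may need extra cleaning.
    return pd.DataFrame(rows[1:], columns=rows[0])

cities_df = scrape_wikitable(
    "https://en.wikipedia.org/wiki/List_of_United_States_cities_by_population")
income_df = scrape_wikitable(
    "https://en.wikipedia.org/wiki/List_of_United_States_counties_by_per_capita_income")
```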
The Foursquare API is then used to get the venues in each city of the United States. Based on the categories of each venue, as decided by the CEO, weights were assigned to each category, and the city with the maximum total weight was selected.
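A minimal sketch of this step, assuming the Foursquare v2 venues/explore endpoint; the credentials, category weights, and search radius below are placeholders, not the actual values used in the project:

```python
import requests

CLIENT_ID = "YOUR_CLIENT_ID"          # placeholder credentials
CLIENT_SECRET = "YOUR_CLIENT_SECRET"
VERSION = "20180605"

# Hypothetical weights per venue category (the real weights were set by the CEO).
CATEGORY_WEIGHTS = {"Shopping Mall": 3, "Restaurant": 2, "Park": 1}

def get_venue_categories(lat, lng, radius=1000, limit=100):
    """Return the primary category name of each venue around a coordinate."""
    url = "https://api.foursquare.com/v2/venues/explore"
    params = dict(client_id=CLIENT_ID, client_secret=CLIENT_SECRET, v=VERSION,
                  ll=f"{lat},{lng}", radius=radius, limit=limit)
    items = requests.get(url, params=params).json()["response"]["groups"][0]["items"]
    return [item["venue"]["categories"][0]["name"]
            for item in items if item["venue"]["categories"]]

def city_score(lat, lng):
    """Sum the weights of all venue categories found around a city's coordinates."""
    return sum(CATEGORY_WEIGHTS.get(cat, 0) for cat in get_venue_categories(lat, lng))

# The city with the highest total weight is then selected, e.g.:
# best_city = max(cities_df.itertuples(), key=lambda c: city_score(c.lat, c.lng))
```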
IntroductionBusiness Problem.pdf: Introduction to the business problem.
CAP.pdf, CAP.PPT: In-depth business plan and the methodology involved, in PDF and PPT format respectively.
The battle of Neighbourhoods - Final Report.ipynb: The notebook containing the source code, the report, and the conclusions drawn from the above data.
Based on the given constraints, the green circle marks the ideal place to build the cinema for maximum footfall and revenue.
Big thanks to Coursera for this capstone project. I would also like to thank Wikipedia for the data.