Merge pull request #49 from titipata/revise-backend
Revise backend
bluenex authored Sep 20, 2019
2 parents f415194 + e6144aa commit abae094
Showing 13 changed files with 391 additions and 460 deletions.
8 changes: 4 additions & 4 deletions backend/README.md
@@ -2,7 +2,7 @@

## Fetch events

`fetch_events.py` contains functions to fetch Penn events. We can run GROBID and then run the fetch script to update the data in `data/events.json` as follows:

```sh
@@ -20,12 +20,12 @@ python cron_fetch_events.py
```


-## Running Flask API locally
+## Running Hug API locally

-Start Flask API by running
+Start Hug API by running:

```sh
-python api.py
+hug -f hug_api.py -p 8888
```
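
Once the server is up, a quick way to check it is to post a few event indices to the recommendation endpoint added later in this commit. This is an illustrative sketch, not part of the README change; it assumes the server is listening on port 8888 and that the indices exist in `data/events.json`:

```python
# Illustrative only: call the /recommendations endpoint of the local Hug server.
import requests

resp = requests.post(
    "http://localhost:8888/recommendations",
    json={"payload": ["1", "5", "12"]},  # placeholder indices of preferred events
)
print(resp.text)  # recommended future event indices (JSON-encoded by the endpoint)
```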


84 changes: 0 additions & 84 deletions backend/api.py

This file was deleted.

57 changes: 57 additions & 0 deletions backend/hug_api.py
@@ -0,0 +1,57 @@
import hug
import json
import numpy as np
from datetime import datetime
import dateutil.parser
from scipy.spatial.distance import cosine

# enable CORS so the frontend can call the API from another origin
api = hug.API(__name__)
api.http.add_middleware(hug.middleware.CORSMiddleware(api))


path_data, path_vector = 'data/events.json', 'data/events_vector.json'
event_vectors = json.load(open(path_vector, 'r'))
events = json.load(open(path_data, 'r'))
# map event index -> precomputed event vector
event_vectors_map = {e['event_index']: e['event_vector']
                     for e in event_vectors}


def get_future_event(date):
    """
    Return True if the event happens after the current time.
    """
    try:
        return dateutil.parser.parse(date) > datetime.now()
    except Exception:
        return False


@hug.post("/recommendations")
def recommendations(body):
    """
    Suggest events from a list of preferred event indices.
    The body is sent as JSON from the frontend with the data under the key 'payload':
        {
            "payload": data
        }
    and is passed to this function as a dictionary.
    """
    event_indices = body['payload']
    pref_indices = [int(event_idx) for event_idx in event_indices]
    # preference vector: mean of the vectors of the preferred events
    pref_vector = np.mean([np.array(event_vectors_map[idx])
                           for idx in pref_indices], axis=0)
    # indices of events that happen after the current time
    future_event_indices = [e['event_index'] for e in
                            filter(lambda r: get_future_event(r['date_dt']), events)]

    # rank future events by cosine distance to the preference vector, keep the top 15
    rank_indices = np.argsort([cosine(pref_vector, event_vectors_map[idx])
                               for idx in future_event_indices])[0:15]
    indices_recommendation = [future_event_indices[i] for i in rank_indices]
    return json.dumps(indices_recommendation)
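
The endpoint builds a single preference vector by averaging the vectors of the events the user picked, then ranks upcoming events by cosine distance to that vector. Below is a minimal sketch of that ranking step with made-up two-dimensional vectors; the real vectors are loaded from `data/events_vector.json`:

```python
# Illustrative only: rank toy event vectors by cosine distance to the mean
# of the preferred events' vectors, mirroring the logic in hug_api.py.
import numpy as np
from scipy.spatial.distance import cosine

event_vectors_map = {0: [1.0, 0.0], 1: [0.9, 0.1], 2: [0.0, 1.0], 3: [0.5, 0.5]}
pref_indices = [0, 1]      # events the user liked
future_indices = [2, 3]    # candidate upcoming events

pref_vector = np.mean([np.array(event_vectors_map[i]) for i in pref_indices], axis=0)
ranked = np.argsort([cosine(pref_vector, event_vectors_map[i]) for i in future_indices])
print([future_indices[i] for i in ranked])  # [3, 2]: closest events first
```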
104 changes: 0 additions & 104 deletions backend/ipynb/regex_datetime.ipynb

This file was deleted.

