Add Pipfile and Pre-commit hook #6

Open · wants to merge 3 commits into main
21 changes: 21 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,21 @@
repos:
  - repo: https://github.com/psf/black
    rev: 23.3.0
    hooks:
      - id: black
        language_version: python3.11
  - repo: https://github.com/pycqa/isort
    rev: 5.12.0  # Use the ref you want to point at
    hooks:
      - id: isort
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.0.1
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
      - id: check-added-large-files
  - repo: https://github.com/hadialqattan/pycln
    rev: v2.1.5
    hooks:
      - id: pycln
34 changes: 34 additions & 0 deletions Pipfile
@@ -0,0 +1,34 @@
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
streamlit = ">=1.3.0"
pypdf2 = ">=1.26.0"
pypdf = ">=0.0.1"
langchain = ">=0.0.188"
openai = ">=0.27.0"
faiss-cpu = ">=1.7.1.post2"
pinecone-client = ">=0.0.8"
pandas = ">=1.3.4"
tiktoken = ">=0.0.1"
gsheetsdb = ">=0.1.13.1"
shillelagh = ">=1.2.4"
gradio-tools = ">=0.0.9"
gspread = ">=5.7.2"
config = ">=0.5.1"
google-api-python-client = ">=2.86.0"
pymupdf = ">=1.22.3"
python-dotenv = ">=1.0.0"
colorama = ">=0.4.6"
bs4 = ">=0.0.1"
black = "*"
isort = "*"
pycln = "*"
pre-commit = "*"

[dev-packages]

[requires]
python_version = "3.11"
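With both files in place, a contributor could bootstrap the environment and the hooks roughly as follows (a sketch, not part of this PR; it assumes `pipenv` is already installed and is run from the repository root):

```shell
# Create the project virtualenv and install everything listed in [packages]
pipenv install

# Register the hooks from .pre-commit-config.yaml as a git pre-commit hook
pipenv run pre-commit install

# Optionally run every configured hook against the whole tree once
pipenv run pre-commit run --all-files
```

After `pre-commit install`, black, isort, pycln, and the pre-commit-hooks checks run automatically on each `git commit`.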
2,184 changes: 2,184 additions & 0 deletions Pipfile.lock

Large diffs are not rendered by default.

7 changes: 4 additions & 3 deletions docs/contribution_guidelines.md
Expand Up @@ -25,9 +25,10 @@ Follow these steps to contribute code through a pull request:
 3. Create a new branch for your changes: `git checkout -b my-branch`.
 4. Make your desired changes or additions to the codebase.
 5. Test your changes to ensure they work as intended.
-6. Commit your changes: `git commit -m "Add feature or fix"`.
-7. Push your changes to your forked repository: `git push origin my-branch`.
-8. Open a pull request from your forked repository to the main Nuggt repository.
+6. Run the pre-commit hook to ensure your code adheres to the project's coding conventions: `pre-commit run --all-files`.
+7. Commit your changes: `git commit -m "Add feature or fix"`.
+8. Push your changes to your forked repository: `git push origin my-branch`.
+9. Open a pull request from your forked repository to the main Nuggt repository.
 
 Please ensure that your code adheres to the project's coding conventions. It's also helpful to include a clear and concise description of your changes in the pull request.

25 changes: 18 additions & 7 deletions nuggt-release/browse.py
@@ -1,25 +1,37 @@
-#This file is taken from https://github.com/Significant-Gravitas/Auto-GPT
+# This file is taken from https://github.com/Significant-Gravitas/Auto-GPT
 
 import requests
 from bs4 import BeautifulSoup
 
 
 # Define and check for local file address prefixes
 def check_local_file_access(url):
-    local_prefixes = ['file:///', 'file://localhost', 'http://localhost', 'https://localhost']
+    local_prefixes = [
+        "file:///",
+        "file://localhost",
+        "http://localhost",
+        "https://localhost",
+    ]
     return any(url.startswith(prefix) for prefix in local_prefixes)
 
 
 def scrape_text(url):
     """Scrape text from a webpage"""
     # Most basic check if the URL is valid:
-    if not url.startswith('http'):
+    if not url.startswith("http"):
         return "Error: Invalid URL"
 
     # Restrict access to local files
     if check_local_file_access(url):
         return "Error: Access to local files is restricted"
 
     try:
-        response = requests.get(url, headers={"User-Agent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.97 Safari/537.36"})
+        response = requests.get(
+            url,
+            headers={
+                "User-Agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.97 Safari/537.36"
+            },
+        )
     except requests.exceptions.RequestException as e:
         return "Error: " + str(e)
 
@@ -35,7 +47,6 @@ def scrape_text(url):
     text = soup.get_text()
     lines = (line.strip() for line in text.splitlines())
     chunks = (phrase.strip() for line in lines for phrase in line.split("  "))
-    text = '\n'.join(chunk for chunk in chunks if chunk)
+    text = "\n".join(chunk for chunk in chunks if chunk)
 
     return text
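The local-file guard that black reformatted above is self-contained, so it can be exercised without the module's `requests`/`bs4` dependencies. A minimal sketch, with the function body copied from the diff:

```python
# Standalone copy of check_local_file_access from nuggt-release/browse.py,
# extracted here so it can be tried in isolation.
def check_local_file_access(url):
    local_prefixes = [
        "file:///",
        "file://localhost",
        "http://localhost",
        "https://localhost",
    ]
    # True if the URL targets a local file or a localhost server
    return any(url.startswith(prefix) for prefix in local_prefixes)


# scrape_text rejects these before issuing any HTTP request:
print(check_local_file_access("file:///etc/passwd"))        # True
print(check_local_file_access("http://localhost:8501/"))    # True
print(check_local_file_access("https://example.com/page"))  # False
```

Note that the check is a plain prefix match, so it blocks `localhost` spellings but not equivalents such as `http://127.0.0.1`.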
