chore: cleanup documentation #29552

Merged · 2 commits · Jul 11, 2024
7 changes: 4 additions & 3 deletions README.md
@@ -169,10 +169,10 @@ how to set up a development environment.
- [Superset SIPs](https://github.com/orgs/apache/projects/170) - The status of Superset's SIPs (Superset Improvement Proposals) for both consensus and implementation status.

Understanding the Superset Points of View

- [The Case for Dataset-Centric Visualization](https://preset.io/blog/dataset-centric-visualization/)
- [Understanding the Superset Semantic Layer](https://preset.io/blog/understanding-superset-semantic-layer/)


- Getting Started with Superset
- [Superset in 2 Minutes using Docker Compose](https://superset.apache.org/docs/installation/docker-compose#installing-superset-locally-using-docker-compose)
- [Installing Database Drivers](https://superset.apache.org/docs/configuration/databases#installing-database-drivers)
@@ -190,8 +190,8 @@ Understanding the Superset Points of View
- [Mixed Time Series Charts](https://preset.io/events/mixed-time-series-visualization-in-superset-workshop/)
- [How the Bing Team Customized Superset for the Internal Self-Serve Data & Analytics Platform](https://preset.io/events/how-the-bing-team-heavily-customized-superset-for-their-internal-data/)
- [Live Demo: Visualizing MongoDB and Pinot Data using Trino](https://preset.io/events/2021-04-13-visualizing-mongodb-and-pinot-data-using-trino/)
- [Introduction to the Superset API](https://preset.io/events/introduction-to-the-superset-api/)
- [Building a Database Connector for Superset](https://preset.io/events/2021-02-16-building-a-database-connector-for-superset/)
- [Introduction to the Superset API](https://preset.io/events/introduction-to-the-superset-api/)
- [Building a Database Connector for Superset](https://preset.io/events/2021-02-16-building-a-database-connector-for-superset/)

- Visualizations
- [Creating Viz Plugins](https://superset.apache.org/docs/contributing/creating-viz-plugins/)
@@ -201,6 +201,7 @@ Understanding the Superset Points of View
- [Superset API](https://superset.apache.org/docs/rest-api)

## Repo Activity

<a href="https://next.ossinsight.io/widgets/official/compose-last-28-days-stats?repo_id=39464018" target="_blank" align="center">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://next.ossinsight.io/widgets/official/compose-last-28-days-stats/thumbnail.png?repo_id=39464018&image_size=auto&color_scheme=dark" width="655" height="auto" />
14 changes: 10 additions & 4 deletions RELEASING/README.md
@@ -115,7 +115,7 @@ source set_release_env.sh 1.5.1rc1 [email protected]

The script will output the exported variables. Here's an example for 1.5.1rc1:

```
```env
-------------------------------
Set Release env variables
SUPERSET_VERSION=1.5.1
@@ -264,13 +264,13 @@ python changelog.py --previous_version 1.5.0 --current_version ${SUPERSET_GITHUB

Finally, bump the version number in `superset-frontend/package.json` (replace with whichever version is being released, excluding the RC suffix):

```
```json
"version": "0.38.0"
```

Commit the change with the version number, then git tag the version with the release candidate and push to the branch:

```
```bash
# add changed files and commit
git add ...
git commit ...
Expand Down Expand Up @@ -366,7 +366,7 @@ The script will interactively ask for extra information needed to fill out the e
voting description, it will generate a passing, non-passing, or non-conclusive email.
Here's an example:

```
```text
A List of people with +1 binding vote (ex: Max,Grace,Krist): Daniel,Alan,Max,Grace
A List of people with +1 non binding vote (ex: Ville): Ville
A List of people with -1 vote (ex: John):
@@ -516,16 +516,22 @@ reference), and whether to force the `latest` Docker tag on the
generated images.

### Npm Release

You might want to publish the latest @superset-ui release to npm:

```bash
cd superset/superset-frontend
```

An automated GitHub action will run and generate a new tag, which will contain a version number provided as a parameter.

```bash
export GH_TOKEN={GITHUB_TOKEN}
npx lerna version {VERSION} --conventional-commits --create-release github --no-private --yes --message {COMMIT_MESSAGE}
```

This action will publish the specified version to the npm registry.

```bash
npx lerna publish from-package --yes
```
3 changes: 2 additions & 1 deletion docs/docs/contributing/howtos.mdx
@@ -434,7 +434,7 @@ To debug Flask running in POD inside a kubernetes cluster, you'll need to make s
add: ["SYS_PTRACE"]
```

See (set capabilities for a container)[https://kubernetes.io/docs/tasks/configure-pod-container/security-context/#set-capabilities-for-a-container] for more details.
See [set capabilities for a container](https://kubernetes.io/docs/tasks/configure-pod-container/security-context/#set-capabilities-for-a-container) for more details.

Once the pod is running as root and has the `SYS_PTRACE` capability, it will be able to debug the Flask app.

@@ -590,6 +590,7 @@ Finally, for the translations to take effect we need to compile translation cata
binary MO files for the backend using `pybabel`.

```bash
# inside the project root
pybabel compile -d superset/translations
```

26 changes: 17 additions & 9 deletions scripts/tests/README.md
@@ -26,48 +26,56 @@ You can use a different DB backend by defining `SUPERSET__SQLALCHEMY_DATABASE_UR

This script will not install any dependencies for you, so you must already be in a configured virtualenv.

## Use:
## Usage

To show all supported switches:
```$bash

```bash
scripts/tests/run.sh --help
```

From the superset repo root directory:

- Example: run all tests:
```$bash

```bash
scripts/tests/run.sh
```

- Example: run a single test module:
```$bash

```bash
scripts/tests/run.sh --module tests/charts/api_tests.py
```

- Example: run a single test:
```$bash

```bash
scripts/tests/run.sh --module tests/charts/api_tests.py::TestChartApi::test_get_charts
```

- Example: run a single test without any init procedures. Init procedures include
  resetting the test database, running db upgrade, superset init, and loading example data.
  If your tests are idempotent, subsequent runs after the first are really fast:
```$bash

```bash
scripts/tests/run.sh --module tests/charts/api_tests.py::TestChartApi::test_get_charts --no-init
```

- Example: don't recreate the test DB (still runs all the test init procedures):
```$bash

```bash
scripts/tests/run.sh --module tests/charts/api_tests.py::TestChartApi::test_get_charts --no-reset-db
```

- Example: don't run tests, just initialize the test DB (drop/create, upgrade, and load examples):
```$bash

```bash
scripts/tests/run.sh --no-tests
```

- Example: just reset the test DB:
```$bash

```bash
scripts/tests/run.sh --reset-db --no-tests
```
22 changes: 14 additions & 8 deletions superset-websocket/README.md
@@ -56,7 +56,8 @@ In addition to periodic socket connection cleanup, the internal _channels_ regis
## Install

Install dependencies:
```

```bash
npm ci
```

@@ -71,12 +72,14 @@ Configuration via environment variables is also supported which can be helpful i
Configure the Superset Flask app to enable global async queries (in `superset_config.py`):

Enable the `GLOBAL_ASYNC_QUERIES` feature flag:
```

```python
"GLOBAL_ASYNC_QUERIES": True
```

Configure the following Superset values:
```

```python
GLOBAL_ASYNC_QUERIES_TRANSPORT = "ws"
GLOBAL_ASYNC_QUERIES_WEBSOCKET_URL = "ws://<host>:<port>/"
```
@@ -86,7 +89,8 @@ Note that the WebSocket server must be run on the same hostname (different port)
Note also that `localhost` and `127.0.0.1` are not considered the same host. For example, if you're pointing your browser to `localhost:<port>` for Superset, then the WebSocket URL will need to be configured as `localhost:<port>`.

The following config values must contain the same values in both the Flask app config and `config.json`:
```

```text
GLOBAL_ASYNC_QUERIES_REDIS_CONFIG
GLOBAL_ASYNC_QUERIES_REDIS_STREAM_PREFIX
GLOBAL_ASYNC_QUERIES_JWT_COOKIE_NAME
@@ -114,26 +118,28 @@ The application is tracking a couple of metrics with `statsd` using the [hot-sho
## Running

Running locally via dev server:
```

```bash
npm run dev-server
```

Running in production:
```

```bash
npm run build && npm start
```

## Health check

The WebSocket server supports health checks via one of:

```
```text
GET /health
```

OR

```
```text
HEAD /health
```
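The `/health` contract above can be exercised end-to-end. The sketch below stands up a minimal stub server implementing the same endpoint and probes it with both GET and HEAD; the handler, the `OK` body, and the loopback host are illustrative assumptions, not the superset-websocket implementation:

```python
# Stub health-check server plus client probes (illustrative only).
import http.server
import threading
import urllib.request

class HealthHandler(http.server.BaseHTTPRequestHandler):
    def _respond(self):
        # Answer 200 on /health, 404 elsewhere.
        self.send_response(200 if self.path == "/health" else 404)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()

    def do_GET(self):
        self._respond()
        if self.path == "/health":
            self.wfile.write(b"OK")

    def do_HEAD(self):
        # Same status line as GET, no body.
        self._respond()

    def log_message(self, *args):
        pass  # keep output quiet

server = http.server.HTTPServer(("127.0.0.1", 0), HealthHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# GET /health
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/health")
body = resp.read()

# HEAD /health
head = urllib.request.urlopen(
    urllib.request.Request(f"http://127.0.0.1:{port}/health", method="HEAD")
)

server.shutdown()
print(resp.status, head.status)  # 200 200
```

Either verb is sufficient for a load balancer's liveness probe; HEAD avoids transferring a body.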

6 changes: 5 additions & 1 deletion superset-websocket/utils/README.md
@@ -17,16 +17,20 @@ specific language governing permissions and limitations
under the License.
-->
# Test & development utilities

The files provided here are for testing and development only, and are not required to run the WebSocket server application.

## Test client application

The Express web application in `client-ws-app` is provided for testing the WebSocket server. See `client-ws-app/README.md` for details.

## Load testing script

The `loadtest.js` script is provided to populate the Redis streams with event data.

### Running
```

```bash
node loadtest.js
```

8 changes: 6 additions & 2 deletions superset-websocket/utils/client-ws-app/README.md
@@ -17,17 +17,21 @@ specific language governing permissions and limitations
under the License.
-->
# Test client application

This Express web application is provided for testing the WebSocket server. It is not required for running the server application, and is provided here for testing and development purposes only.

## Running

First, start the WebSocket server:
```

```bash
cd ..
npm run dev-server
```

Then run the client application:
```

```bash
cd client-ws-app
npm install
npm start
7 changes: 4 additions & 3 deletions superset/db_engine_specs/README.md
@@ -175,6 +175,7 @@ FROM
GROUP BY
UPPER(country_of_origin)
```

### `time_groupby_inline = False`

In theory this attribute should be used to omit time filters from the self-joins. When the attribute is false, the time attribute will be present in the subquery used to compute limited series, e.g.:
@@ -415,21 +416,21 @@ DB engine specs should implement a class method called `get_function_names` that

Superset does a good job of keeping credentials secure. When you add a database with a password, for example:

```
```text
postgresql://admin:[email protected]:5432/db
```

The password is sent over the network only when the database is created. When you edit the database later, Superset will return this as the SQLAlchemy URI:

```
```text
postgresql://admin:[email protected]:5432/db
```

The password will be masked in the API response; it's not just masked in the browser UI. This is done to avoid sending the password unnecessarily over the network. Also, if a non-admin user has access to the API response, they won't be able to learn the database password.

When the database is edited, the Superset backend is smart enough to replace the masked password with the actual password, unless the password has changed. That is, if you change the database in the URI from `db` to `db2` the SQLAlchemy URI will be stored in the backend as:

```
```text
postgresql://admin:[email protected]:5432/db2
```
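The masking behavior described above can be illustrated with a stdlib sketch (this is not Superset's actual code; the `mask_password` helper and the `XXXXXXXXXX` placeholder are assumptions for illustration):

```python
# Illustrative sketch: mask the password component of a SQLAlchemy-style URI.
from urllib.parse import urlsplit, urlunsplit

def mask_password(uri: str, mask: str = "XXXXXXXXXX") -> str:
    parts = urlsplit(uri)
    if parts.password is None:
        # No credentials present, nothing to mask.
        return uri
    netloc = f"{parts.username}:{mask}@{parts.hostname}"
    if parts.port:
        netloc += f":{parts.port}"
    return urlunsplit((parts.scheme, netloc, parts.path, parts.query, parts.fragment))

print(mask_password("postgresql://admin:password123@db.example.org:5432/db"))
# postgresql://admin:XXXXXXXXXX@db.example.org:5432/db
```

The real backend additionally stores the true password so it can substitute it back in when an edited URI still carries the mask.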
