
Commit 6a2590d

docs: Just read docs and adjust small type (#254)
This PR relates only to documentation polish.
1 parent: 579058e

16 files changed: +35 −29 lines


docker/playground/filldb/filldb.sh (1 addition, 1 deletion)

```diff
@@ -13,7 +13,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-cd $TMP_DIR/postgresDBSamples/adventureworks
+cd $TMP_DIR/postgresDBSamples/adventureworks || exit
 
 if ! psql -lqt -p 5432 -h playground-db -U postgres | cut -d \| -f 1 | grep -qw $ORIGINAL_DB_NAME; then
   psql -p 5432 -h playground-db -U postgres -c "CREATE DATABASE $ORIGINAL_DB_NAME;"
```
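The `|| exit` added above is the standard shell guard (ShellCheck rule SC2164): if `cd` fails, the script should stop rather than keep running commands in the wrong working directory. A minimal standalone sketch of the pattern follows; the function name and paths are illustrative, not part of the project script.

```shell
# Sketch of the `cd ... || exit` guard. The subshell makes a failed cd
# abort only this unit of work, not the caller's shell.
enter_and_report() {
  (
    cd "$1" || exit 1   # abort the subshell if the directory is missing
    pwd                 # only runs when the cd succeeded
  )
}

enter_and_report /                                       # prints "/"
enter_and_report /no/such/dir 2>/dev/null || echo "aborted: cd failed"
```

Without the guard, a missing directory would merely print an error and the remaining commands would run against whatever directory the script happened to be in.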

docs/architecture.md (6 additions, 1 deletion)

```diff
@@ -2,7 +2,12 @@
 
 ## Introduction
 
-It is evident that the most appropriate approach for executing logical backup dumping and restoration is by leveraging the core PostgreSQL utilities, specifically `pg_dump` and `pg_restore`. Greenmask has been purposefully designed to align with PostgreSQL's native utilities, ensuring compatibility. Greenmask primarily handles data dumping operations independently and delegates the responsibilities of schema dumping and restoration to `pg_dump` and `pg_restore` respectively, maintaining seamless integration with PostgreSQL's standard tools.
+It is evident that the most appropriate approach for executing logical backup dumping
+and restoration is by leveraging the core PostgreSQL utilities, specifically `pg_dump` and `pg_restore`.
+Greenmask has been purposefully designed to align with PostgreSQL's native utilities, ensuring compatibility.
+Greenmask primarily handles data dumping operations independently and delegates
+the responsibilities of schema dumping and restoration to `pg_dump` and `pg_restore` respectively,
+maintaining seamless integration with PostgreSQL's standard tools.
 
 ## Backup process
 
```
docs/built_in_transformers/dynamic_parameters.md (1 addition, 1 deletion)

````diff
@@ -117,7 +117,7 @@ constraints, you can use dynamic parameters in the `RandomDate` transformer:
 template: '{{ .GetValue | tsModify "18 years" | .EncodeValue }}' # (7)
 ```
 
-1. Firstly we generate the `RadnomDate` for birthdate column. The result of the transformation will used as the minimum
+1. Firstly we generate the `RadnomDate` for birthdate column. The result of the transformation will use as the minimum
 value for the next transformation for `hiredate` column.
 2. Apply the template for static parameter. It calculates the now date and subtracts `30` years from it. The result
 is `1994`. The function tsModify return not a raw data, but time.Time object. For getting the raw value suitable for
````
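The annotations in this hunk describe chaining two transformations: the value generated for the `birthdate` column feeds the minimum bound of the `RandomDate` transformer on `hiredate`. A condensed, hypothetical sketch of that wiring, inferred only from the names and template visible in the hunk (the exact YAML layout should be checked against the full dynamic-parameters page):

```yaml
# Hypothetical fragment: key names are inferred from the annotated example
# in the docs, not copied from it verbatim.
- name: RandomDate        # transforms the hiredate column
  params:
    column: hiredate
  dynamic_params:
    min:
      column: birthdate   # reuse the already-transformed birthdate value
      # shift the referenced value forward 18 years before using it as min
      template: '{{ .GetValue | tsModify "18 years" | .EncodeValue }}'
```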

docs/built_in_transformers/standard_transformers/random_amount_with_currency.md (3 additions, 3 deletions)

```diff
@@ -12,10 +12,10 @@ currencies.
 
 ## Description
 
-This transformer automatically generates random financial amounts along with corresponding global currency codes (e. g.,
+This transformer automatically generates random financial amounts along with corresponding global currency codes (e.g.,
 `250.00 USD`, `300.00 EUR`), injecting them into the designated database column. It provides a straightforward solution
-for populating financial records with varied and realistic data, suitable for testing payment systems, data
-anonymization, and simulation of economic models.
+for populating financial records with varied and realistic data, suitable for testing payment systems, data anonymization,
+and simulation of economic models.
 
 ## Example: Populate the `payments` table with random amounts and currencies
 
```

docs/built_in_transformers/standard_transformers/random_day_of_week.md (2 additions, 2 deletions)

```diff
@@ -11,8 +11,8 @@ planning, or any scenario where the day of the week is relevant but the specific
 
 ## Description
 
-Utilizing the `faker` library, the `RandomDayOfWeek` transformer generates names of days (e. g., Monday, Tuesday) at
-random. This transformer can be applied to any text or varchar column in a database, introducing variability and realism
+Utilizing the `faker` library, the `RandomDayOfWeek` transformer generates names of days (e.g., Monday, Tuesday) at random.
+This transformer can be applied to any text or varchar column in a database, introducing variability and realism
 into data sets that need to represent days of the week in a non-specific manner.
 
 ## Example: Populate random days of the week for the `work_schedule` table
```

docs/built_in_transformers/standard_transformers/random_timezone.md (3 additions, 3 deletions)

```diff
@@ -12,9 +12,9 @@ timezone-related functionalities, or anonymizing real user timezone information
 ## Description
 
 Utilizing a comprehensive library or algorithm for generating timezone data, the `RandomTimezone` transformer provides
-random timezone strings (e. g., "America/New_York", "Europe/London") for database columns. This feature enables the
-creation of diverse and realistic datasets by simulating timezone information for user profiles, event timings, or any
-other data requiring timezone context.
+random timezone strings (e.g., "America/New_York", "Europe/London") for database columns.
+This feature enables the creation of diverse and realistic datasets by simulating timezone information for user profiles,
+event timings, or any other data requiring timezone context.
 
 ## Example: Populate random timezone strings for the `user_accounts` table
 
```
docs/built_in_transformers/standard_transformers/random_url.md (2 additions, 2 deletions)

```diff
@@ -12,8 +12,8 @@ input, or anonymizing real web addresses in datasets.
 ## Description
 
 Utilizing advanced algorithms or libraries for generating URL strings, the `RandomURL` transformer injects random,
-plausible URLs into the designated database column. Each generated URL is structured to include the protocol (e. g., "
-http://", "https://"), domain name, and path, offering a realistic range of web addresses for various applications.
+plausible URLs into the designated database column. Each generated URL is structured to include the protocol
+(e.g., "http://", "https://"), domain name, and path, offering a realistic range of web addresses for various applications.
 
 ## Example: Populate random URLs for the `webpages` table
 
```

docs/commands/validate.md (1 addition, 1 deletion)

```diff
@@ -30,7 +30,7 @@ All of those cases may be used for CI/CD pipelines to stop the process when some
 useful when `--schema` flag is used - this allows to avoid data leakage when schema changed.
 
 You can use the `--table` flag multiple times to specify the tables you want to check. Tables can be written with
-or without schema names (e. g., `public.table_name` or `table_name`). If you specify multiple tables from different
+or without schema names (e.g., `public.table_name` or `table_name`). If you specify multiple tables from different
 schemas, an error will be thrown.
 
 To start validation, use the following command:
```

docs/configuration.md (1 addition, 1 deletion)

````diff
@@ -223,7 +223,7 @@ validate:
 ```
 { .annotate }
 
-1. A list of tables to validate. If this list is not empty, the validation operation will only be performed for the specified tables. Tables can be written with or without the schema name (e. g., `"public.cart"` or `"orders"`).
+1. A list of tables to validate. If this list is not empty, the validation operation will only be performed for the specified tables. Tables can be written with or without the schema name (e.g., `"public.cart"` or `"orders"`).
 2. Specifies whether to perform data transformation for a limited set of rows. If set to `true`, data transformation will be performed, and the number of rows transformed will be limited to the value specified in the `rows_limit` parameter (default is `10`).
 3. Specifies whether to perform diff operations for the transformed data. If set to `true`, the validation process will **find the differences between the original and transformed data**. See more details in the [validate command documentation](commands/validate.md).
 4. Limits the number of rows to be transformed during validation. The default limit is `10` rows, but you can change it by modifying this parameter.
````

docs/database_subset.md (1 addition, 1 deletion)

```diff
@@ -117,7 +117,7 @@ dump:
 
 Greenmask supports polymorphic references. You can define a virtual reference for a table with polymorphic references
 using `polymorphic_exprs` attribute. The `polymorphic_exprs` attribute is a list of expressions that are used to make
-a polymorphic reference. For instance we might have a table `comments` that has polymorphic reference to `posts` and
+a polymorphic reference. For instance, we might have a table `comments` that has polymorphic reference to `posts` and
 `videos`. The table comments might have `commentable_id` and `commentable_type` columns. The `commentable_type` column
 contains the type of the table that is referenced by the `commentable_id` column. The example of the config:
 
```
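The paragraph in this hunk describes the polymorphic-reference scenario but the excerpt cuts off before the config it promises. A hedged, hypothetical sketch of what such a config might look like, using only the table and column names the paragraph mentions (the exact key layout is an assumption and should be verified against the full `database_subset` page):

```yaml
# Hypothetical fragment: structure inferred from the prose above, not from
# the actual example. commentable_type discriminates which table
# commentable_id points to.
virtual_references:
  - name: "comments"
    references:
      - name: "posts"
        columns:
          - name: "commentable_id"
        polymorphic_exprs:
          - 'comments.commentable_type = ''post'''
      - name: "videos"
        columns:
          - name: "commentable_id"
        polymorphic_exprs:
          - 'comments.commentable_type = ''video'''
```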
