Commit 240c232 ("resubmission")

1 parent 9006d96

12 files changed: +140, -72 lines changed

DESCRIPTION

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,6 +1,6 @@
 Package: cloudfs
 Title: Streamlined Interface to Interact with Cloud Storage Platforms
-Version: 0.1.1
+Version: 0.1.2
 Authors@R: c(
     person("Iaroslav", "Domin", email = "[email protected]", role = c("aut", "cre")),
     person("Stefan", "Musch", email = "[email protected]", role = c("aut")),
```

NEWS.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,3 +1,3 @@
-# cloudfs 0.1.1
+# cloudfs 0.1.2
 
 * Initial version.
```

R/drive_transfer.R

Lines changed: 6 additions & 3 deletions

```diff
@@ -59,9 +59,12 @@ cloud_drive_upload <- function(file, root = NULL) {
 #' @return Invisibly returns `NULL` after successfully downloading the file.
 #'
 #' @examplesIf interactive()
-#' # downloads data/demo.csv from project's Google Drive folder
-#' # (provided it exists) and saves it to local 'data' folder
-#' cloud_drive_download("data/demo.csv")
+#' # downloads toy_data/demo.csv from project's Google Drive folder
+#' # (provided it exists) and saves it to local 'toy_data' folder
+#' cloud_drive_download("toy_data/demo.csv")
+#'
+#' # clean up
+#' unlink("toy_data", recursive = TRUE)
 #'
 #' @export
 cloud_drive_download <- function(file, root = NULL) {
```

R/drive_transfer_bulk.R

Lines changed: 5 additions & 2 deletions

```diff
@@ -167,12 +167,15 @@ cloud_drive_upload_bulk <- function(content, quiet = FALSE, root = NULL) {
 #' @return Invisibly returns the input `content` dataframe.
 #'
 #' @examplesIf interactive()
-#' # provided there's a folder called "data" in the root of your project's
+#' # provided there's a folder called "toy_data" in the root of your project's
 #' # Google Drive folder, and this folder contains "csv" files
-#' cloud_drive_ls("data") |>
+#' cloud_drive_ls("toy_data") |>
 #'   filter(type == "csv") |>
 #'   cloud_drive_download_bulk()
 #'
+#' # clean up
+#' unlink("toy_data", recursive = TRUE)
+#'
 #' @export
 cloud_drive_download_bulk <- function(content, quiet = FALSE) {
   cont <- cloud_drive_prep_bulk(content, what = "download", quiet = quiet)
```

R/s3_transfer.R

Lines changed: 6 additions & 3 deletions

```diff
@@ -57,9 +57,12 @@ cloud_s3_upload <- function(file, root = NULL) {
 #' @return Invisibly returns `NULL` after successfully downloading the file.
 #'
 #' @examplesIf interactive()
-#' # downloads data/demo.csv from project's S3 folder (provided it exists)
-#' # and saves it to local 'data' folder
-#' cloud_s3_download("data/demo.csv")
+#' # downloads toy_data/demo.csv from project's S3 folder (provided it exists)
+#' # and saves it to local 'toy_data' folder
+#' cloud_s3_download("toy_data/demo.csv")
+#'
+#' # clean up
+#' unlink("toy_data", recursive = TRUE)
 #'
 #' @export
 cloud_s3_download <- function(file, root = NULL) {
```

R/s3_transfer_bulk.R

Lines changed: 5 additions & 2 deletions

```diff
@@ -131,12 +131,15 @@ cloud_s3_upload_bulk <- function(content, quiet = FALSE, root = NULL) {
 #' @return Invisibly returns the input `content` dataframe.
 #'
 #' @examplesIf interactive()
-#' # provided there's a folder called "data" in the root of your project's
+#' # provided there's a folder called "toy_data" in the root of your project's
 #' # S3 folder, and this folder contains "csv" files
-#' cloud_s3_ls("data") |>
+#' cloud_s3_ls("toy_data") |>
 #'   filter(type == "csv") |>
 #'   cloud_s3_download_bulk()
 #'
+#' # clean up
+#' unlink("toy_data", recursive = TRUE)
+#'
 #' @export
 cloud_s3_download_bulk <- function(content, quiet = FALSE, root = NULL) {
   check_string(root, alt_null = TRUE)
```

cran-comments.md

Lines changed: 92 additions & 48 deletions

````diff
@@ -1,52 +1,96 @@
 ## Resubmission
-This is a resubmission. In this version I have:
-
-* **Updated the `Description` field in the `DESCRIPTION` file** to:
-  - Place software names ('Google Drive' and 'Amazon S3') in single quotes as
-    per CRAN guidelines.
-  - Add links to the respective web services for 'Google Drive' and 'Amazon S3'
-    for clarity and compliance.
-
-* **Removed examples from unexported functions.**
-
-In response to the feedback, I have made the following changes to minimize the
-use of the `\dontrun` tag:
-
-* **Examples Updated to Execute Safely**:
-  - In `cloud_read_excel.Rd` and `cloud_get_roots.Rd`, I introduced examples
-    that can be safely executed and removed the `\dontrun` tags.
-
-* **Limited Use of `\dontrun`**:
-  - For `cloud_local_ls.Rd`, I retained the `\dontrun` tag for only one example.
-
-* **Conditional Execution with `\dontshow`**:
-  The majority of the functions in the package require the user to associate the
-  project directory with a cloud storage, a process that involves inserting a
-  URL of a 'Google Drive' or an 'Amazon S3' folder into the console. For these
-  functions, I wrapped the examples in `\dontshow` conditional on
-  `interactive()`. The affected files include:
-  - `cloud_drive_attach.Rd`
-  - `cloud_drive_browse.Rd`
-  - `cloud_drive_ls.Rd`
-  - `cloud_drive_upload.Rd`
-  - `cloud_drive_download.Rd`
-  - `cloud_drive_write.Rd`
-  - `cloud_drive_read.Rd`
-  - `cloud_drive_upload_bulk.Rd`
-  - `cloud_drive_download_bulk.Rd`
-  - `cloud_drive_write_bulk.Rd`
-  - `cloud_drive_read_bulk.Rd`
-  - `cloud_drive_spreadsheet_autofit.Rd`
-  - `cloud_s3_attach.Rd`
-  - `cloud_s3_browse.Rd`
-  - `cloud_s3_upload.Rd`
-  - `cloud_s3_download.Rd`
-  - `cloud_s3_read.Rd`
-  - `cloud_s3_write.Rd`
-  - `cloud_s3_upload_bulk.Rd`
-  - `cloud_s3_download_bulk.Rd`
-  - `cloud_s3_write_bulk.Rd`
-  - `cloud_s3_read_bulk.Rd`
+This is a resubmission. Below are my responses to the feedback received in the
+previous review.
+
+### User options
+
+```
+Please always make sure to reset to user's options(), working directory or par() after you changed it in examples and vignettes and demos. --> inst/doc/cloudfs.R
+e.g.:
+old <- options(digits = 3)
+...
+options(old)
+```
+
+I've addressed this by removing the `options(width = 150)` command from the
+beginning of the cloudfs.Rmd vignette.
+
+### User Space Integrity
+
+```
+Please ensure that your functions do not write by default or in your
+examples/vignettes/tests in the user's home filespace (including the package
+directory and getwd()). This is not allowed by CRAN policies.
+Please omit any default path in writing functions. In your
+examples/vignettes/tests you can write to tempdir().
+e.g.: man/cloud_drive_spreadsheet_autofit.Rd ; man/cloud_drive_upload.Rd ; ...
+```
+
+I understand the importance of adhering to CRAN policies and ensuring that there
+are no unintended consequences for the users of the package. However, I have
+chosen not to make the suggested modifications, and I'd like to explain the
+rationale behind this decision and why, in my opinion, the package does not
+violate the policies.
+
+One of the key features of the package is that it enables the use of concise
+relative paths for both the current working directory and associated cloud
+project folders. For instance, consider the task of uploading a local file,
+"models/glm.rds", to a project's S3 folder. Using `aws.s3`, the code would be:
+
+```R
+aws.s3::put_object(
+  file = "models/glm.rds",
+  bucket = "project-data",
+  object = "project-1/models/glm.rds"
+)
+```
+
+With `cloudfs`, the same can be achieved with significantly simpler syntax:
+
+```R
+cloud_s3_upload("models/glm.rds")
+```
+
+Applying `cloud_s3_upload()` to a file located in a temporary folder goes
+against its design intent. Its main objective is to upload files while mirroring
+the folder structure between the current directory and the project's S3.
+Demonstrating this with a temp folder file would misrepresent the function's
+typical application.
+
+That being said, I've taken comprehensive measures to ensure no accidental or
+default file writing occurs in the current working directory:
+
+- **Initial Setup**: Most functions, including all `*read*`, `*write*`,
+  `*upload*`, and `*download*` functions, require users to link their project
+  directory with cloud storage during the package's first use. This entails
+  obtaining explicit user consent.
+
+- **dontshow**: Consequently, examples where this linkage would activate are
+  wrapped in `\dontshow` conditional on `interactive()`.
+
+- **Read Functions**: These initially pull files from the cloud to a temp folder
+  for reading, leaving the working directory untouched. In examples, the working
+  directory is also untouched.
+
+- **Write Functions**: Files are first created in a temp folder, then sent to
+  the cloud. The working directory remains untouched.
+
+- **Download Functions**: These do pull files into the working directory, but
+  this is their primary purpose and they cannot write anywhere outside of it.
+  Also, in examples (shielded with `\dontshow`), I've added code to remove the
+  downloaded files.
+
+- **Upload Functions**: In examples, files are generated in the working
+  directory for uploading purposes. Still, cleanup code ensures their removal
+  afterward.
+
+- **Vignettes**: Chunks using `cloudfs` functions aren't executed; they're all
+  tagged with `eval=FALSE`.
+
+- **Tests**: These only operate when Google Drive or S3 tokens are available,
+  excluding execution on CRAN. When testing, I use temporary folders for project
+  creation and employ `withr::with_dir` to execute `cloudfs` code — a strategy
+  suitable for testing but not for example clarity.
 
 ## R CMD check results
 
````

man/cloud_drive_download.Rd

Lines changed: 6 additions & 3 deletions (generated file; diff not rendered)

man/cloud_drive_download_bulk.Rd

Lines changed: 5 additions & 2 deletions (generated file; diff not rendered)

man/cloud_s3_download.Rd

Lines changed: 6 additions & 3 deletions (generated file; diff not rendered)
