Merge pull request #3309 from IQSS/4.5-export-harvest-docs
A quick fix to move the exportAll API calls under /api/admin.
kcondon authored Aug 24, 2016
2 parents 6a10c70 + 6f915f4 commit 911b4f3
Showing 3 changed files with 62 additions and 28 deletions.
4 changes: 2 additions & 2 deletions doc/sphinx-guides/source/admin/metadataexport.rst
@@ -15,9 +15,9 @@ Batch exports through the API

In addition to the automated exports, a Dataverse admin can start a batch job through the API. The following 2 API calls are provided:

-  /api/datasets/exportAll?key=...
+  /api/admin/metadata/exportAll

-  /api/datasets/reExportAll?key=...
+  /api/admin/metadata/reExportAll

The former will attempt to export all the published, local (non-harvested) datasets that haven't been exported yet.
The latter will *force* a re-export of every published, local dataset, regardless of whether it has already been exported or not.
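
For illustration only (not part of this commit), here is a minimal sketch of triggering the two new admin endpoints from a Java client; the base URL http://localhost:8080 and the use of java.net.HttpURLConnection are assumptions for the example, not anything prescribed by the change.

import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

// Minimal sketch: fire the batch-export admin calls and print the HTTP status.
public class ExportAllClient {

    // Hypothetical base URL; point this at your own Dataverse installation.
    private static final String BASE = "http://localhost:8080";

    public static void main(String[] args) throws IOException {
        // Export all published, local datasets that haven't been exported yet:
        trigger("/api/admin/metadata/exportAll");
        // Or force a re-export of every published, local dataset:
        // trigger("/api/admin/metadata/reExportAll");
    }

    private static void trigger(String path) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(BASE + path).openConnection();
        conn.setRequestMethod("GET");
        // The handlers respond immediately; the export jobs themselves run asynchronously.
        System.out.println(path + " -> HTTP " + conn.getResponseCode());
        conn.disconnect();
    }
}

Both calls return right away; the actual export work happens in the background, as the code changes below show.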
32 changes: 6 additions & 26 deletions src/main/java/edu/harvard/iq/dataverse/api/Datasets.java
@@ -221,6 +221,12 @@ public Response createDataset(String jsonBody) {
return okResponse(jsonbuilder.add("latestVersion", json(importedDataset.getLatestVersion())));
} */

// TODO:
// This API call should, ideally, call findUserOrDie() and the GetDatasetCommand
// to obtain the dataset that we are trying to export - which would handle
// Auth in the process... For now, Auth isn't necessary - since export ONLY
// WORKS on published datasets, which are open to the world. -- L.A. 4.5

@GET
@Path("/export")
@Produces({"application/xml", "application/json"})
@@ -260,32 +266,6 @@ public Response exportDataset(@QueryParam("persistentId") String persistentId, @
}
}

// The following 2 commands start export all jobs in the background,
// asynchronously.
// (These API calls should probably not be here;
// May be under "/admin" somewhere?)
// exportAll will attempt to go through all the published, local
// datasets *that haven't been exported yet* - which is determined by
// checking the lastexporttime value of the dataset; if it's null, or < the last
// publication date = "unexported" - and export them.
@GET
@Path("/exportAll")
@Produces("application/json")
public Response exportAll() {
datasetService.exportAllAsync();
return this.accepted();
}

// reExportAll will FORCE A FULL REEXPORT on every published, local
// dataset, regardless of the lastexporttime value.
@GET
@Path("/reExportAll")
@Produces("application/json")
public Response reExportAll() {
datasetService.reExportAllAsync();
return this.accepted();
}

@DELETE
@Path("{id}")
public Response deleteDataset( @PathParam("id") String id) {
54 changes: 54 additions & 0 deletions src/main/java/edu/harvard/iq/dataverse/api/Metadata.java
@@ -0,0 +1,54 @@
/*
* To change this license header, choose License Headers in Project Properties.
* To change this template file, choose Tools | Templates
* and open the template in the editor.
*/
package edu.harvard.iq.dataverse.api;

import edu.harvard.iq.dataverse.DatasetServiceBean;
import java.util.logging.Logger;
import javax.ejb.EJB;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.Response;

/**
*
* @author Leonid Andreev
*
*/

@Path("admin/metadata")
public class Metadata extends AbstractApiBean {
private static final Logger logger = Logger.getLogger(Metadata.class.getName());

@EJB
DatasetServiceBean datasetService;

// The following 2 commands start export all jobs in the background,
// asynchronously.
// (These API calls should probably not be here;
// May be under "/admin" somewhere?)
// exportAll will attempt to go through all the published, local
// datasets *that haven't been exported yet* - which is determined by
// checking the lastexporttime value of the dataset; if it's null, or < the last
// publication date = "unexported" - and export them.
@GET
@Path("/exportAll")
@Produces("application/json")
public Response exportAll() {
datasetService.exportAllAsync();
return this.accepted();
}

// reExportAll will FORCE A FULL REEXPORT on every published, local
// dataset, regardless of the lastexporttime value.
@GET
@Path("/reExportAll")
@Produces("application/json")
public Response reExportAll() {
datasetService.reExportAllAsync();
return this.accepted();
}
}
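
As the comments in the new Metadata.java above describe, exportAll() only picks up published, local datasets whose lastexporttime is null or older than the latest publication date, while reExportAll() ignores that check. A minimal sketch of that "unexported" test, with illustrative method and parameter names that are not taken from DatasetServiceBean:

import java.util.Date;

// Illustrative only: the selection rule described in the comments above.
class ExportCheck {

    // A published, local dataset is "unexported" if it has never been exported,
    // or if it was published again after the last export.
    static boolean needsExport(Date lastExportTime, Date lastPublicationDate) {
        return lastExportTime == null || lastExportTime.before(lastPublicationDate);
    }
}

reExportAll() skips this check and re-exports every published, local dataset regardless.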
