Merge branch 'develop' into 2960-download-manager-help-msg
sekmiller committed Dec 19, 2018
2 parents 707b792 + 0ec3b73 commit 6f64acf
Showing 48 changed files with 143 additions and 91 deletions.
4 changes: 3 additions & 1 deletion doc/sphinx-guides/source/developers/dev-environment.rst
@@ -169,7 +169,9 @@ After the script has finished, you should be able to log into Dataverse with the
Configure Your Development Environment for Publishing
-----------------------------------------------------

In order to publish datasets, you must configure Dataverse with a username and password for a persistent ID provider. The installer configures your development environment to use DOIs (rather than Handles) for persistent IDs with DataCite's test server at https://mds.test.datacite.org as the provider. In order to publish datasets with this provider, you must email [email protected] and ask for a test account. Once you have your DataCite username and password, you must add them as JVM options (``doi.username`` and ``doi.password``) as described under "Persistent Identifiers and Publishing Datasets" in the :doc:`/installation/config` section of the Installation Guide.
Alternatively, developers can configure the fake DOI provider by running the following command:

``curl http://localhost:8080/api/admin/settings/:DoiProvider -X PUT -d FAKE``
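
If you go the DataCite route instead, the two JVM options mentioned above are typically added with the ``asadmin`` tool, roughly along these lines (a sketch with placeholder values; see the Installation Guide for the authoritative steps):

``./asadmin create-jvm-options '-Ddoi.username=YOUR_DATACITE_USERNAME'``

``./asadmin create-jvm-options '-Ddoi.password=YOUR_DATACITE_PASSWORD'``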

Next Steps
----------
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/installation/config.rst
@@ -884,7 +884,7 @@ By default the footer says "Copyright © [YYYY]" but you can add text after the
:DoiProvider
++++++++++++

As of this writing "DataCite" and "EZID" are the only valid options. ``:DoiProvider`` is only needed if you are using DOI.
As of this writing "DataCite" and "EZID" are the only valid options for production installations. Developers are welcome to use "FAKE". ``:DoiProvider`` is only needed if you are using DOI.

``curl -X PUT -d DataCite http://localhost:8080/api/admin/settings/:DoiProvider``
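
For a development installation, the same setting can point at the fake provider instead, for example:

``curl -X PUT -d FAKE http://localhost:8080/api/admin/settings/:DoiProvider``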

2 changes: 1 addition & 1 deletion scripts/deploy/phoenix.dataverse.org/post
@@ -5,7 +5,7 @@ cd ../..
psql -U dvnapp dvndb -f scripts/database/reference_data.sql
psql -U dvnapp dvndb -f doc/sphinx-guides/source/_static/util/pg8-createsequence-prep.sql
psql -U dvnapp dvndb -f doc/sphinx-guides/source/_static/util/createsequence.sql
curl http://localhost:8080/api/admin/settings/:DoiProvider -X PUT -d DataCite
curl http://localhost:8080/api/admin/settings/:DoiProvider -X PUT -d FAKE
scripts/search/tests/publish-dataverse-root
git checkout scripts/api/data/dv-root.json
scripts/search/tests/grant-authusers-add-on-root
12 changes: 7 additions & 5 deletions src/main/java/Bundle.properties
@@ -1184,7 +1184,7 @@ dataset.submitBtn=Submit for Review
dataset.disabledSubmittedBtn=Submitted for Review
dataset.submitMessage=You will not be able to make changes to this dataset while it is in review.
dataset.submit.success=Your dataset has been submitted for review.
dataset.inreview.infoMessage=\u2013 This dataset is currently under review prior to publication.
dataset.inreview.infoMessage=The draft version of this dataset is currently under review prior to publication.
dataset.submit.failure=Dataset Submission Failed - {0}
dataset.submit.failure.null=Can't submit for review. Dataset is null.
dataset.submit.failure.isReleased=Latest version of dataset is already released. Only draft versions can be submitted for review.
@@ -1220,6 +1220,7 @@ dataset.share.datasetShare=Share Dataset
dataset.share.datasetShare.tip=Share this dataset on your favorite social media networks.
dataset.share.datasetShare.shareText=View this dataset.
dataset.locked.message=Dataset Locked
dataset.locked.message.details=This dataset is locked until publication.
dataset.locked.inReview.message=Submitted for Review
dataset.publish.error=This dataset may not be published due to an error when contacting the <a href=\{1} target=\"_blank\"/> {0} </a> Service. Please try again.
dataset.publish.error.doi=This dataset may not be published because the DOI update failed.
@@ -1236,8 +1237,9 @@ dataset.compute.computeBatchListHeader=Compute Batch
dataset.compute.computeBatchRestricted=This dataset contains restricted files you may not compute on because you have not been granted access.
dataset.delete.error=Could not deaccession the dataset because the {0} update failed.
dataset.publish.worldMap.deleteConfirm=Please note that your data and map on WorldMap will be removed due to restricted file access changes in this dataset version which you are publishing. Do you want to continue?
dataset.publish.workflow.inprogress=Publish workflow in progress
dataset.pidRegister.workflow.inprogress=Register/update file persistent identifiers workflow in progress
dataset.publish.workflow.message=Publish in Progress
dataset.publish.workflow.inprogress=This dataset is locked until publication.
dataset.pidRegister.workflow.inprogress=This dataset is locked while the file persistent identifiers are being registered or updated.
dataset.versionUI.draft=Draft
dataset.versionUI.inReview=In Review
dataset.versionUI.unpublished=Unpublished
@@ -1401,8 +1403,8 @@ file.rsyncUpload.step2=Download this file upload script:
file.rsyncUpload.step2.downloadScriptButton=Download DCM Script
file.rsyncUpload.step3=Open a terminal window in the same directory you saved the script and run this command: <code>bash ./{0}</code>
file.rsyncUpload.step4=Follow the instructions in the script. It will ask for a full path (beginning with "/") to the directory containing your data. Note: this script will expire after 7 days.
file.rsyncUpload.inProgressMessage.summary=DCM File Upload
file.rsyncUpload.inProgressMessage.details=This dataset is locked until the data files have been transferred and verified.
file.rsyncUpload.inProgressMessage.summary=File Upload in Progress
file.rsyncUpload.inProgressMessage.details=This dataset is locked while the data files are being transferred and verified.
file.rsyncUpload.httpUploadDisabledDueToRsyncFileExisting=HTTP upload is disabled for this dataset because you have already uploaded files via rsync. If you would like to switch to HTTP upload, please contact {0}.
file.rsyncUpload.httpUploadDisabledDueToRsyncFileExistingAndPublished=HTTP upload is disabled for this dataset because you have already uploaded files via rsync and published the dataset.
file.rsyncUpload.rsyncUploadDisabledDueFileUploadedViaHttp=Upload with rsync + SSH is disabled for this dataset because you have already uploaded files via HTTP. If you would like to switch to rsync upload, then you must first remove all uploaded files from this dataset. Once this dataset is published, the chosen upload method is permanently locked in.
10 changes: 5 additions & 5 deletions src/main/java/edu/harvard/iq/dataverse/DatasetPage.java
@@ -1552,10 +1552,10 @@ private String init(boolean initFull) {
}

// Various info messages, when the dataset is locked (for various reasons):
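// Note that the detailed lock messages below are only shown to users who can update (edit) the dataset.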
if (dataset.isLocked()) {
if (dataset.isLocked() && canUpdateDataset()) {
if (dataset.isLockedFor(DatasetLock.Reason.Workflow)) {
JH.addMessage(FacesMessage.SEVERITY_WARN, BundleUtil.getStringFromBundle("dataset.locked.message"),
BundleUtil.getStringFromBundle("dataset.publish.workflow.inprogress"));
BundleUtil.getStringFromBundle("dataset.locked.message.details"));
}
if (dataset.isLockedFor(DatasetLock.Reason.InReview)) {
JH.addMessage(FacesMessage.SEVERITY_WARN, BundleUtil.getStringFromBundle("dataset.locked.inReview.message"),
@@ -1573,8 +1573,8 @@ private String init(boolean initFull) {
datasetService.removeDatasetLocks(dataset.getId(), DatasetLock.Reason.pidRegister);
}*/
if (dataset.isLockedFor(DatasetLock.Reason.pidRegister)) {
JH.addMessage(FacesMessage.SEVERITY_WARN, BundleUtil.getStringFromBundle("dataset.pidRegister.workflow.inprogress"),
BundleUtil.getStringFromBundle("dataset.publish.workflow.inprogress"));
JH.addMessage(FacesMessage.SEVERITY_WARN, BundleUtil.getStringFromBundle("dataset.publish.workflow.message"),
BundleUtil.getStringFromBundle("dataset.pidRegister.workflow.inprogress"));
}
}

@@ -1943,7 +1943,7 @@ private String releaseDataset(boolean minor) {
if ( result.isCompleted() ) {
JsfHelper.addSuccessMessage(BundleUtil.getStringFromBundle("dataset.message.publishSuccess"));
} else {
JH.addMessage(FacesMessage.SEVERITY_WARN, BundleUtil.getStringFromBundle("dataset.locked.message"), BundleUtil.getStringFromBundle("dataset.publish.workflow.inprogress"));
JH.addMessage(FacesMessage.SEVERITY_WARN, BundleUtil.getStringFromBundle("dataset.locked.message"), BundleUtil.getStringFromBundle("dataset.locked.message.details"));
}

} catch (CommandException ex) {
13 changes: 11 additions & 2 deletions src/main/java/edu/harvard/iq/dataverse/EjbDataverseEngine.java
@@ -15,6 +15,7 @@
import edu.harvard.iq.dataverse.engine.command.exception.CommandException;
import edu.harvard.iq.dataverse.engine.command.exception.PermissionException;
import edu.harvard.iq.dataverse.ingest.IngestServiceBean;
import edu.harvard.iq.dataverse.pidproviders.FakePidProviderServiceBean;
import edu.harvard.iq.dataverse.privateurl.PrivateUrlServiceBean;
import edu.harvard.iq.dataverse.search.IndexBatchServiceBean;
import edu.harvard.iq.dataverse.search.IndexServiceBean;
@@ -108,7 +109,10 @@ public class EjbDataverseEngine {

@EJB
DOIDataCiteServiceBean doiDataCite;


@EJB
FakePidProviderServiceBean fakePidProvider;

@EJB
HandlenetServiceBean handleNet;

@@ -372,7 +376,12 @@ public DOIEZIdServiceBean doiEZId() {
public DOIDataCiteServiceBean doiDataCite() {
return doiDataCite;
}


@Override
public FakePidProviderServiceBean fakePidProvider() {
return fakePidProvider;
}

@Override
public HandlenetServiceBean handleNet() {
return handleNet;
@@ -78,6 +78,7 @@ class BeanDispatcher {
switch ( doiProvider ) {
case "EZID": return ctxt.doiEZId();
case "DataCite": return ctxt.doiDataCite();
case "FAKE": return ctxt.fakePidProvider();
default:
logger.log(Level.SEVERE, "Unknown doiProvider: {0}", doiProvider);
return null;
@@ -518,13 +518,17 @@ private StoredObject initializeSwiftFileObject(boolean writeAccess, String auxIt
Properties p = getSwiftProperties();
swiftEndPoint = p.getProperty("swift.default.endpoint");

// Swift uses this to create pseudo-hierarchical folders
String swiftPseudoFolderPathSeparator = "/";

//swiftFolderPath = dataFile.getOwner().getDisplayName();
String swiftFolderPathSeparator = "-";
String authorityNoSlashes = owner.getAuthority().replace("/", swiftFolderPathSeparator);
swiftFolderPath = owner.getProtocolForFileStorage() + swiftFolderPathSeparator
+ authorityNoSlashes.replace(".", swiftFolderPathSeparator)
+ swiftFolderPathSeparator + owner.getIdentifierForFileStorage();
swiftFileName = storageIdentifier;
+ authorityNoSlashes.replace(".", swiftFolderPathSeparator);
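// The dataset identifier now becomes a pseudo-folder inside the Swift container,
// rather than part of the container name itself.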

swiftFileName = owner.getIdentifierForFileStorage() + swiftPseudoFolderPathSeparator
+ storageIdentifier;
//setSwiftContainerName(swiftFolderPath);
//swiftFileName = dataFile.getDisplayName();
//Storage Identifier is now updated after the object is uploaded on Swift.
@@ -569,10 +573,14 @@ private StoredObject initializeSwiftFileObject(boolean writeAccess, String auxIt
Properties p = getSwiftProperties();
swiftEndPoint = p.getProperty("swift.default.endpoint");
String swiftFolderPathSeparator = "-";

// Swift uses this to create pseudo-hierarchical folders
String swiftPseudoFolderPathSeparator = "/";

String authorityNoSlashes = dataset.getAuthorityForFileStorage().replace("/", swiftFolderPathSeparator);
swiftFolderPath = dataset.getProtocolForFileStorage() + swiftFolderPathSeparator +
authorityNoSlashes.replace(".", swiftFolderPathSeparator) +
swiftFolderPathSeparator + dataset.getIdentifierForFileStorage();
swiftPseudoFolderPathSeparator + dataset.getIdentifierForFileStorage();

swiftFileName = auxItemTag;
dvObject.setStorageIdentifier("swift://" + swiftEndPoint + ":" + swiftFolderPath);
@@ -31,6 +31,7 @@
import edu.harvard.iq.dataverse.datacapturemodule.DataCaptureModuleServiceBean;
import edu.harvard.iq.dataverse.engine.DataverseEngine;
import edu.harvard.iq.dataverse.ingest.IngestServiceBean;
import edu.harvard.iq.dataverse.pidproviders.FakePidProviderServiceBean;
import edu.harvard.iq.dataverse.privateurl.PrivateUrlServiceBean;
import edu.harvard.iq.dataverse.search.IndexBatchServiceBean;
import edu.harvard.iq.dataverse.search.SolrIndexServiceBean;
@@ -99,6 +100,8 @@ public interface CommandContext {

public DOIDataCiteServiceBean doiDataCite();

public FakePidProviderServiceBean fakePidProvider();

public HandlenetServiceBean handleNet();

public GuestbookServiceBean guestbooks();
@@ -22,6 +22,7 @@
import static java.util.stream.Collectors.joining;
import javax.validation.ConstraintViolation;
import edu.harvard.iq.dataverse.GlobalIdServiceBean;
import edu.harvard.iq.dataverse.pidproviders.FakePidProviderServiceBean;

/**
*
@@ -158,6 +159,16 @@ protected void registerExternalIdentifier(Dataset theDataset, CommandContext ctx
if (!theDataset.isIdentifierRegistered()) {
GlobalIdServiceBean globalIdServiceBean = GlobalIdServiceBean.getBean(theDataset.getProtocol(), ctxt);
if ( globalIdServiceBean != null ) {
if (globalIdServiceBean instanceof FakePidProviderServiceBean) {
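// The fake provider never contacts an external registration service,
// so the identifier can be marked as registered right away.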
try {
globalIdServiceBean.createIdentifier(theDataset);
} catch (Throwable ex) {
logger.warning("Problem running createIdentifier for FakePidProvider: " + ex);
}
theDataset.setGlobalIdCreateTime(getTimestamp());
theDataset.setIdentifierRegistered(true);
return;
}
try {
if (globalIdServiceBean.alreadyExists(theDataset)) {
int attempts = 0;
66 changes: 66 additions & 0 deletions src/main/java/edu/harvard/iq/dataverse/pidproviders/FakePidProviderServiceBean.java
@@ -0,0 +1,66 @@
package edu.harvard.iq.dataverse.pidproviders;

import edu.harvard.iq.dataverse.AbstractGlobalIdServiceBean;
import edu.harvard.iq.dataverse.DvObject;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import javax.ejb.Stateless;
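
/**
 * A PID provider for development and testing: it satisfies the
 * GlobalIdServiceBean contract without contacting any external registration
 * service, so "FAKE" identifiers are created and publicized immediately.
 */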

@Stateless
public class FakePidProviderServiceBean extends AbstractGlobalIdServiceBean {

@Override
public boolean alreadyExists(DvObject dvo) throws Exception {
return true;
}

@Override
public boolean registerWhenPublished() {
return false;
}

@Override
public List<String> getProviderInformation() {
ArrayList<String> providerInfo = new ArrayList<>();
String providerName = "FAKE";
String providerLink = "http://dataverse.org";
providerInfo.add(providerName);
providerInfo.add(providerLink);
return providerInfo;
}

@Override
public String createIdentifier(DvObject dvo) throws Throwable {
return "fakeIdentifier";
}

@Override
public Map<String, String> getIdentifierMetadata(DvObject dvo) {
Map<String, String> map = new HashMap<>();
return map;
}

@Override
public String modifyIdentifierTargetURL(DvObject dvo) throws Exception {
return "fakeModifyIdentifierTargetURL";
}

@Override
public void deleteIdentifier(DvObject dvo) throws Exception {
// no-op
}

@Override
public Map<String, String> lookupMetadataFromIdentifier(String protocol, String authority, String identifier) {
Map<String, String> map = new HashMap<>();
return map;
}

@Override
public boolean publicizeIdentifier(DvObject studyIn) {
return true;
}

}
2 changes: 0 additions & 2 deletions src/main/webapp/403.xhtml
@@ -5,7 +5,6 @@
xmlns:ui="http://java.sun.com/jsf/facelets"
xmlns:o="http://omnifaces.org/ui"
xmlns:p="http://primefaces.org/ui">
<f:view locale="#{dataverseLocaleBean.localeCode}">
<h:head>
</h:head>

@@ -34,5 +33,4 @@
</ui:define>
</ui:composition>
</h:body>
</f:view>
</html>
2 changes: 0 additions & 2 deletions src/main/webapp/404.xhtml
@@ -6,7 +6,6 @@
xmlns:p="http://primefaces.org/ui"
xmlns:o="http://omnifaces.org/ui"
xmlns:jsf="http://xmlns.jcp.org/jsf">
<f:view locale="#{dataverseLocaleBean.localeCode}">
<h:head>
</h:head>

@@ -44,5 +43,4 @@
</ui:define>
</ui:composition>
</h:body>
</f:view>
</html>
2 changes: 0 additions & 2 deletions src/main/webapp/500.xhtml
@@ -4,7 +4,6 @@
xmlns:f="http://java.sun.com/jsf/core"
xmlns:ui="http://java.sun.com/jsf/facelets"
xmlns:p="http://primefaces.org/ui">
<f:view locale="#{dataverseLocaleBean.localeCode}">
<h:head>
</h:head>

@@ -21,5 +20,4 @@
</ui:define>
</ui:composition>
</h:body>
</f:view>
</html>
2 changes: 0 additions & 2 deletions src/main/webapp/ThemeAndWidgets.xhtml
@@ -6,7 +6,6 @@
xmlns:p="http://primefaces.org/ui"
xmlns:c="http://xmlns.jcp.org/jsp/jstl/core"
xmlns:jsf="http://xmlns.jcp.org/jsf">
<f:view locale="#{dataverseLocaleBean.localeCode}">
<h:head>
</h:head>

@@ -28,5 +27,4 @@
</ui:define>
</ui:composition>
</h:body>
</f:view>
</html>
2 changes: 0 additions & 2 deletions src/main/webapp/confirmemail.xhtml
@@ -5,7 +5,6 @@
xmlns:ui="http://java.sun.com/jsf/facelets"
xmlns:p="http://primefaces.org/ui"
xmlns:jsf="http://xmlns.jcp.org/jsf">
<f:view locale="#{dataverseLocaleBean.localeCode}">
<h:head>
</h:head>

@@ -29,5 +28,4 @@
</ui:define>
</ui:composition>
</h:body>
</f:view>
</html>
2 changes: 0 additions & 2 deletions src/main/webapp/dashboard-users.xhtml
@@ -6,7 +6,6 @@
xmlns:jsf="http://xmlns.jcp.org/jsf"
xmlns:p="http://primefaces.org/ui"
xmlns:c="http://xmlns.jcp.org/jsp/jstl/core">
<f:view locale="#{dataverseLocaleBean.localeCode}">
<h:head>
</h:head>

@@ -115,5 +114,4 @@
</ui:composition>

</h:body>
</f:view>
</html>
2 changes: 0 additions & 2 deletions src/main/webapp/dashboard.xhtml
@@ -5,7 +5,6 @@
xmlns:ui="http://java.sun.com/jsf/facelets"
xmlns:jsf="http://xmlns.jcp.org/jsf"
xmlns:p="http://primefaces.org/ui">
<f:view locale="#{dataverseLocaleBean.localeCode}">
<h:head>
</h:head>

@@ -136,5 +135,4 @@
</ui:define>
</ui:composition>
</h:body>
</f:view>
</html>
