
Commit

Merge branch '4.0.1'
pdurbin committed Jun 8, 2015
2 parents d301f2b + 4b92ae5 commit ba8c0b3
Showing 91 changed files with 2,434 additions and 1,373 deletions.
2 changes: 1 addition & 1 deletion conf/solr/4.6.0/schema.xml
@@ -281,7 +281,7 @@
<!-- Added for Dataverse 4.0 alpha 1: static "parent" fields not copied to "catchall" field https://redmine.hmdc.harvard.edu/issues/3603 -->
<!-- We index parentid and parentname as a debugging aid in case we want to match on it with an explicit query like curl 'http://localhost:8983/solr/collection1/select?rows=100&wt=json&indent=true&q=parentid%3A42' or curl 'http://localhost:8983/solr/collection1/select?rows=100&wt=json&indent=true&q=parentname%3Abirds' -->
<!-- TODO: store parentid as a long instead of a string -->
<field name="parentId" type="string" stored="true" indexed="false" multiValued="false"/>
<field name="parentId" type="string" stored="true" indexed="true" multiValued="false"/>
<field name="parentIdentifier" type="string" stored="true" indexed="false" multiValued="false"/>
<field name="parentName" type="string" stored="true" indexed="false" multiValued="false"/>
<field name="parentCitation" type="string" stored="true" indexed="false" multiValued="false"/>
Binary file added doc/sphinx-guides/source/_static/logo.png
5 changes: 2 additions & 3 deletions doc/sphinx-guides/source/api/sword.rst
@@ -55,7 +55,7 @@ New features as of v1.1

- "Contact E-mail" is automatically populated from dataset owners email.

- "Subject" is automatically populated with "N/A".
- "Subject" uses our controlled vocabulary list of subjects. This list is in the Citation Metadata of our User Guide > `Metadata References <http://guides.dataverse.org/en/latest/user/appendix.html#metadata-references>`_. Otherwise, if a term does not match our controlled vocabulary list, it will put any subject terms in "Keyword". If Subject is empty it is automatically populated with "N/A".

- Zero-length files are now allowed (but not necessarily encouraged).

@@ -82,15 +82,14 @@ Example Atom entry (XML)

Dublin Core Terms (DC Terms) Qualified Mapping - Dataverse DB Element Crosswalk
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

+-----------------------------+----------------------------------------------+--------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------+
|DC (terms: namespace) | Dataverse DB Element | Required | Note |
+=============================+==============================================+==============+=============================================================================================================================================================+
|dcterms:title | title | Y | Title of the Dataset. |
+-----------------------------+----------------------------------------------+--------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------+
|dcterms:creator | authorName (LastName, FirstName) | Y | Author(s) for the Dataset. |
+-----------------------------+----------------------------------------------+--------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------+
|dcterms:subject | subject (Controlled Vocabulary) OR keyword | Y | Controlled Vocabulary list is in our User Guide > `Metadata References <../user/appendix.html#metadata-references>`_. |
|dcterms:subject | subject (Controlled Vocabulary) OR keyword | Y | Controlled Vocabulary list is in our User Guide > `Metadata References <http://guides.dataverse.org/en/latest/user/appendix.html#metadata-references>`_. |
+-----------------------------+----------------------------------------------+--------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------+
|dcterms:description | dsDescriptionValue | Y | Describing the purpose, scope or nature of the Dataset. Can also use dcterms:abstract. |
+-----------------------------+----------------------------------------------+--------------+-------------------------------------------------------------------------------------------------------------------------------------------------------------+
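
As a quick, hedged illustration of how these terms are used in a deposit (the hostname, dataverse alias, API token, metadata values, and even the exact endpoint below are assumptions; see the example Atom entry and the endpoints documented earlier in this guide for the authoritative form)::

    # Write a minimal Atom entry using dcterms fields from the crosswalk above.
    cat > atom-entry-study.xml <<'EOF'
    <?xml version="1.0"?>
    <entry xmlns="http://www.w3.org/2005/Atom"
           xmlns:dcterms="http://purl.org/dc/terms/">
       <dcterms:title>Bird Migration Patterns, 2014</dcterms:title>
       <dcterms:creator>Finch, Fiona</dcterms:creator>
       <dcterms:subject>Medicine, Health and Life Sciences</dcterms:subject>
       <dcterms:description>Observational data on songbird migration.</dcterms:description>
    </entry>
    EOF

    # POST the entry to the SWORD collection for your dataverse; the API token
    # is passed as the username and the password is left blank.
    curl -u $API_TOKEN: --data-binary "@atom-entry-study.xml" \
      -H "Content-Type: application/atom+xml" \
      "https://$HOSTNAME/dvn/api/data-deposit/v1.1/swordv2/collection/dataverse/$DATAVERSE_ALIAS"
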
6 changes: 3 additions & 3 deletions doc/sphinx-guides/source/conf.py
@@ -63,9 +63,9 @@
# built documents.
#
# The short X.Y version.
version = '4.0'
version = '4.0.1'
# The full version, including alpha/beta/rc tags.
release = '4.0'
release = '4.0.1'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
@@ -224,7 +224,7 @@

# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
# using the given strftime format.
#html_last_updated_fmt = '%b %d, %Y'
html_last_updated_fmt = '%b %d, %Y'

# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
62 changes: 62 additions & 0 deletions doc/sphinx-guides/source/developers/documentation.rst
@@ -0,0 +1,62 @@
=============
Documentation
=============

Quick Fix
-----------

If you find a typo or a small error in the documentation you can easily fix it using GitHub.

- Fork the repository
- Go to [your GitHub username]/dataverse/doc/sphinx-guides/source and access the file you would like to fix
- Click the Edit button in the upper-right corner and fix the error
- Submit a pull request

Other Changes (Sphinx)
----------------------

The documentation for Dataverse was written using Sphinx (http://sphinx-doc.org/).
If you are interested in suggesting changes or updates, we recommend that you build
the HTML files locally using Sphinx and then submit a pull request through GitHub. Here are the instructions on how to proceed:


Installing Sphinx
~~~~~~~~~~~~~~~~~

On a Mac:

Download the sphinx zip file from http://sphinx-doc.org/install.html

Unzip it somewhere. In the unzipped directory, do the following as
root (``sudo -i``):

python setup.py build
python setup.py install

Alternative option (Mac/Unix/Windows):

Unless you already have it, install pip (https://pip.pypa.io/en/latest/installing.html)

run ``pip install sphinx`` in a terminal



This is all you need. You should now be able to build HTML/PDF documentation from the git sources locally.
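
As an optional sanity check (a suggestion only, not part of the official instructions), you can confirm that Sphinx is importable and that the ``sphinx-build`` command is on your PATH::

    # Print the installed Sphinx version.
    python -c "import sphinx; print(sphinx.__version__)"

    # Confirm the sphinx-build executable is available.
    which sphinx-build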

Using Sphinx
~~~~~~~~~~~~

First, you will need to make a fork of the dataverse repository in GitHub. Then, you will need to make a clone of your fork so you can manipulate the files outside GitHub.

To edit the existing documentation, go to the ~/dataverse/doc/sphinx-guides/source directory inside your clone. There you will find the .rst files that correspond to the guides on the Dataverse guides page (http://guides.dataverse.org/en/latest/user/index.html). Using your preferred text editor, open and edit these files, or create new .rst files and edit the others accordingly.

Once you are done, open a terminal and change directories to ~/dataverse/doc/sphinx-guides. Then run the following commands:

- ``make clean``

- ``make html``

After Sphinx is done processing the files, you should notice that the html folder in the ~/dataverse/doc/sphinx-guides/build directory has been updated.
You can open the files in the html folder to preview the changes.

Now you can make a commit with the changes to your own fork in GitHub and submit a pull request to the dataverse repository.
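
As a rough sketch of that last step (the branch name and commit message are just examples, and ``origin`` is assumed to point at your fork)::

    cd ~/dataverse
    git checkout -b doc-fixes          # work on a topic branch
    git add doc/sphinx-guides/source
    git commit -m "Fix typos in the developer guide"
    git push origin doc-fixes          # then open a pull request on GitHub
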
1 change: 1 addition & 0 deletions doc/sphinx-guides/source/developers/index.rst
@@ -15,6 +15,7 @@ Contents:
dev-environment
branching-strategy
testing
documentation
coding-style
making-releases
tools
6 changes: 3 additions & 3 deletions doc/sphinx-guides/source/index.rst
@@ -3,10 +3,10 @@
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Dataverse 4.0 Guides
=======================================================
Dataverse 4.0.1 Guides
======================

These guides are for the most recent version of Dataverse (v4.0). For the guides for **version 3.6.2** please go `here <http://guides.dataverse.org/en/3.6.2/>`_.
These guides are for the most recent version of Dataverse. For the guides for **version 4.0** please go `here <http://guides.dataverse.org/en/4.0/>`_.

.. toctree::
:glob:
20 changes: 20 additions & 0 deletions doc/sphinx-guides/source/installation/administration.rst
@@ -0,0 +1,20 @@
Administration
==============

.. contents:: :local:

User Administration
-------------------

Deleting an API token
~~~~~~~~~~~~~~~~~~~~~

If an API token is compromised it should be deleted. Users will be able to do this themselves once https://github.com/IQSS/dataverse/issues/1098 is complete but until then someone with access to the database must do it.

Using the API token 7ae33670-be21-491d-a244-008149856437 as an example:

``delete from apitoken where tokenstring = '7ae33670-be21-491d-a244-008149856437';``

You should expect the output ``DELETE 1`` after issuing the command above.
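
For example, the statement can be issued with ``psql`` (a sketch only: the database name ``dvndb`` and user ``dvnapp`` are assumptions, so substitute whatever your installation uses)::

    # Connect to the Dataverse database and delete the compromised token.
    psql -U dvnapp -d dvndb -c "delete from apitoken where tokenstring = '7ae33670-be21-491d-a244-008149856437';"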

After the API token has been deleted, users can generate a new one per :doc:`/user/account`.
1 change: 1 addition & 0 deletions doc/sphinx-guides/source/installation/index.rst
@@ -17,3 +17,4 @@ Contents:
installation-main
r-rapache-tworavens
shibboleth
administration
10 changes: 5 additions & 5 deletions doc/sphinx-guides/source/user/appendix.rst
@@ -9,19 +9,19 @@ Metadata References
======================

Dataverse is committed to using standard-compliant metadata to ensure that Dataverse
metadata can be mapped easily to standard metadata schemas and be exported into XML/JSON
format for preservation and interoperability.
metadata can be mapped easily to standard metadata schemas and be exported into JSON
format (XML for tabular file metadata) for preservation and interoperability.

Detailed below are the metadata schemas we support for Citation and Domain-Specific Metadata in Dataverse:

- `Citation Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=0>`__: compliant with `DDI Lite <http://www.ddialliance.org/specification/ddi2.1/lite/index.html>`_, `DDI 2.5 Codebook <http://www.ddialliance.org/>`__, `DataCite 3.1 <http://schema.datacite.org/meta/kernel-3.1/doc/DataCite-MetadataKernel_v3.1.pdf>`__, and Dublin Core's `DCMI Metadata Terms <http://dublincore.org/documents/dcmi-terms/>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/citation.tsv>`__). Language field uses `ISO 639-2 <http://www.loc.gov/standards/iso639-2/php/code_list.php>`__ controlled vocabulary.
- `Citation Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=0>`__: compliant with `DDI Lite <http://www.ddialliance.org/specification/ddi2.1/lite/index.html>`_, `DDI 2.5 Codebook <http://www.ddialliance.org/>`__, `DataCite 3.1 <http://schema.datacite.org/meta/kernel-3.1/doc/DataCite-MetadataKernel_v3.1.pdf>`__, and Dublin Core's `DCMI Metadata Terms <http://dublincore.org/documents/dcmi-terms/>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/4.0.1/scripts/api/data/metadatablocks/citation.tsv>`__). Language field uses `ISO 639-2 <http://www.loc.gov/standards/iso639-2/php/code_list.php>`__ controlled vocabulary.
- `Geospatial Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=4>`__: compliant with DDI Lite, DDI 2.5 Codebook, DataCite, and Dublin Core (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/geospatial.tsv>`__). Country / Nation field uses `ISO 3166-1 <http://en.wikipedia.org/wiki/ISO_3166-1>`_ controlled vocabulary.
- `Social Science & Humanities Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=1>`__: compliant with DDI Lite, DDI 2.5 Codebook, and Dublin Core (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/social_science.tsv>`__).
- `Astronomy and Astrophysics Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=3>`__
: These metadata elements can be mapped/exported to the International Virtual Observatory Alliance’s (IVOA)
`VOResource Schema format <http://www.ivoa.net/documents/latest/RM.html>`__ and is based on
`Virtual Observatory (VO) Discovery and Provenance Metadata <http://www.wf4ever-project.org/wiki/download/attachments/1179927/DPmetadata.pdf?version=1&modificationDate=1337186963000>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/astrophysics.tsv>`__).
- `Life Sciences Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=2>`__: based on `ISA-Tab Specification <http://isatab.sourceforge.net/format.html>`__, along with controlled vocabulary from subsets of the `OBI Ontology <http://bioportal.bioontology.org/ontologies/OBI>`__ and the `NCBI Taxonomy for Organisms <http://www.ncbi.nlm.nih.gov/Taxonomy/taxonomyhome.html/>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/biomedical.tsv>`__).
`Virtual Observatory (VO) Discovery and Provenance Metadata <http://perma.cc/H5ZJ-4KKY>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/astrophysics.tsv>`__).
- `Life Sciences Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=2>`__: based on `ISA-Tab Specification <http://isatab.sourceforge.net/format.html>`__, along with controlled vocabulary from subsets of the `OBI Ontology <http://bioportal.bioontology.org/ontologies/OBI>`__ and the `NCBI Taxonomy for Organisms <http://www.ncbi.nlm.nih.gov/Taxonomy/taxonomyhome.html/>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/4.0.1/scripts/api/data/metadatablocks/biomedical.tsv>`__).



6 changes: 4 additions & 2 deletions doc/sphinx-guides/source/user/dataset-management.rst
@@ -55,12 +55,14 @@ Metadata found in the header section of `Flexible Image Transport System (FITS)
aggregated and displayed in the Astronomy Domain-Specific Metadata of the Dataset that the file belongs to. This FITS file metadata is therefore searchable
and browsable (facets) at the Dataset-level.

Compressed Files: tar & zip
Compressed Files
----------------------------------------

Compressed files in tar and zip format are unpacked automatically. If it fails to unpack, for whatever reason, it will upload as
Compressed files in zip format are unpacked automatically. If it fails to unpack, for whatever reason, it will upload as
is. If the number of files inside is more than a set limit (1,000), you will get an error message and the file will upload as is.

Support for unpacking tar files will be added when this ticket is closed: https://github.com/IQSS/dataverse/issues/2195.
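
As a small illustration (the file and directory names are made up), a zip archive is unpacked into its individual files on upload, while a tar archive currently uploads as a single file::

    # This zip will be unpacked automatically (up to the 1,000-file limit).
    zip -r mydataset.zip data/

    # This tar.gz will be stored as a single file until issue 2195 is resolved.
    tar -czf mydataset.tar.gz data/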

Advanced Options
---------------------------------------------
There are several advanced options available for certain file types.