SNIC Documentation - User contributions: Tom Langborg (NSC) (https://snicdocs.nsc.liu.se/w/api.php?action=feedcontributions&user=Tom+Langborg+%28NSC%29&feedformat=atom), retrieved 2024-03-29T09:45:22Z, MediaWiki 1.31.10

Apply for storage on Swestore (https://snicdocs.nsc.liu.se/w/index.php?title=Apply_for_storage_on_Swestore&diff=6012), revision by Tom Langborg (NSC), 2014-12-17T15:54:23Z
<hr />
<div>[[Category:SweStore]]<br />
[[Category:SweStore user guide]]<br />
[[SweStore|< SweStore]]<br />
<br />
The SweStore nationally accessible storage is available to researchers financed by VR (which includes all researchers using SNIC compute resources) and Formas.<br><br><br />
SweStore currently has a shortage of resources, and new projects will not be allocated until spring 2015. Existing SweStore projects that need more storage capacity are strongly encouraged to review their present data to see whether any of it can be compressed or deleted. If you have any questions, please contact the SNIC office (office@snic.se).<br />
<br />
= Research Communities =<br />
<br />
SweStore is also in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], [http://www.bioimaging.se/swedish_bioimaging_network/Welcome.html Bioimage], [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX], [http://wlcg.web.cern.ch/ WLCG] and [http://www.nrm.se/ Naturhistoriska Riksmuseet]. If any of these cover your research area, first read their information on applying for SweStore storage.<br />
<br />
== UPPNEX ==<br />
If you are a member of an UPPNEX project on UPPMAX, an iRODS service is already available and configured for you; see http://www.uppmax.uu.se/faq/how-to-move-files-to-swestore-using-irods for more information on how to use it.<br />
<br />
== Other communities ==<br />
Unless you are instructed otherwise, submit an application to SweStore as outlined below.<br />
<br />
= Application instructions =<br />
<br />
The Swestore application now requires additional fields. The application process will soon move to the SUPR system; once it has moved, these fields can be filled in via SUPR instead.<br />
<br />
Send an email to [mailto:support@swestore.se support@swestore.se] <br />
<br />
Please include the following information in the application:<br><br />
* Whether you want the regular/production SweStore storage (based on dCache) or the new iRODS-based storage (currently in a pre-production pilot phase).<br />
** If you apply for iRODS storage, please provide a shipping address to which your Yubikey should be sent.<br />
<br />
* Principal Investigator (PI) - (Account Responsible)<br />
** Name<br />
** Job title/position<br />
** E-mail address<br />
** Phone no<br />
** Citizenship<br />
** Department<br />
** Organization<br />
** Address<br />
** Postal Code<br />
** City<br />
<br />
* Technical Administrator (the Primary contact)<br />
** Name<br />
** Job title/position<br />
** E-mail address<br />
** Phone no<br />
** Citizenship<br />
** Department<br />
** Organization<br />
** Address<br />
** Postal Code<br />
** City<br />
<br />
* Project classification code according to the VR classification codes: http://classification.vr.se<br />
* Associated SNIC compute resource allocation (if any)<br />
* Project summary and purpose of the storage (abstract): a short description of the project and the type of data (max 250 words)<br />
* Funding agency<br />
* Describe how the data is produced (e.g. computer simulation, experiments, observations, sensors, digitization of existing data)<br />
* Describe by whom the data is produced, especially if this is not done within the project itself<br />
* Required storage capacity: preferably a maximum size; if this cannot currently be determined, please estimate a starting size and its growth over time.<br />
** '''NOTE''' that applications larger than 10 TB take longer to process, as they go through SNIC approval.<br />
* Please indicate the expected total storage volume (in terabytes) per year and how it is spread across storage media<br />
<br />
{| class="wikitable"<br />
|-<br />
! Type<br />
! 2014<br />
! 2015<br />
! 2016<br />
! 2017<br />
! 2018<br />
|-<br />
| Online, Ready-to-use Data<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
|-<br />
| Backup, Longer-Latency Data<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
|}<br />
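As a sketch of how the per-year volumes for the table above might be estimated from a starting size and a growth rate (all numbers here are made-up placeholders, not guidance):

```python
# Hypothetical sketch: project yearly online-storage needs for the
# application table. Starting size and growth rate are example numbers.
start_tb = 5.0         # TB of online data in the first year
growth_per_year = 2.0  # TB added each subsequent year

years = [2014, 2015, 2016, 2017, 2018]
online = {y: start_tb + growth_per_year * i for i, y in enumerate(years)}
print(online)  # {2014: 5.0, 2015: 7.0, 2016: 9.0, 2017: 11.0, 2018: 13.0}
```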
<br />
* Describe how frequently the data will be uploaded to Swestore (once, rarely, continuously)<br />
* Describe which communities should be able to access the data (e.g. a specific set of people, a VO, part of a VO, publicly available)<br />
* Describe what kind of access users should have to the data (read-only, or read and write)<br />
* Suggested project title: this should be a short, descriptive, human-readable name<br />
* Suggested directory name: this will be used as the root directory name for your storage<br />
** '''NOTE''' that this name is long-lived and will persist. It is not coupled to the lifetime of SNIC compute time allocations.<br />
** We recommend a project name not tied to a person.<br />
** Additionally, we recommend that the name is not a common word or a term easily confusable with other current or future research efforts.<br />
** It is a good idea to select a name that's short and easy to type.<br />
** The name is limited to lower-case letters a-z, digits 0-9, hyphens (-) and underscores (_).</div> Tom Langborg (NSC)

Swestore-dCache (https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-dCache&diff=6011), revision by Tom Langborg (NSC), 2014-12-17T15:53:09Z: /* Getting access */
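The directory-name restriction in the application instructions above (lowercase letters, digits, hyphens and underscores only) can be sketched as a simple check; this is illustrative, not an official SweStore tool:

```python
import re

# Illustrative check of the suggested-directory-name rule:
# only lowercase a-z, digits 0-9, hyphens and underscores are allowed.
NAME_RE = re.compile(r"^[a-z0-9_-]+$")

def is_valid_name(name: str) -> bool:
    return bool(NAME_RE.match(name))

print(is_valid_name("climate_model-2014"))  # True
print(is_valid_name("Climate Model"))       # False (uppercase and space)
```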
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], [http://www.bioimaging.se/swedish_bioimaging_network/Welcome.html Bioimage Sweden], [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX], [http://wlcg.web.cern.ch/ WLCG] and [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage "SweStore"=<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems.<br />
<br />
Swestore is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc], [http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax]. Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple crash of a storage element to losing an entire site while still providing access to the stored data. <br />
<br />
One of the major advantages of the distributed nature of dCache and iRODS ([[Swestore-irods]]) is the excellent aggregate transfer rates it makes possible. This is achieved by bypassing a central node and letting transfers go directly to/from the storage elements when the protocol allows it. The Swestore Nationally Accessible Storage system can achieve aggregate transfer rates in excess of 100 gigabit per second; in practice, transfers are limited by each university's connectivity (usually 10 Gbit/s) or, when only a few files are transferred, by per-connection throughput (typically at most 1 Gbit/s per file/connection).<br />
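As a back-of-the-envelope illustration of what the quoted link speed means for a transfer (idealised: protocol overhead and contention are ignored):

```python
# Sketch: time to move a dataset at the typical per-university link
# speed of 10 Gbit/s, ignoring overhead. Sizes use decimal terabytes.
size_tb = 1.0           # dataset size in TB
link_gbit_s = 10.0      # usable link speed in Gbit/s

size_bits = size_tb * 1e12 * 8          # 1 TB = 1e12 bytes
seconds = size_bits / (link_gbit_s * 1e9)
print(f"{seconds / 60:.1f} minutes")    # 13.3 minutes for 1 TB at 10 Gbit/s
```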
<br />
To protect against silent data corruption the dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
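A client-side analogue of this server-side integrity check is to record a checksum at upload time and compare it after download. The sketch below uses SHA-256 for illustration; the checksum algorithm dCache actually uses may differ:

```python
import hashlib

# Illustrative client-side integrity check: hash a file in chunks so
# arbitrarily large files can be verified without loading them into memory.
def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()
```

Comparing the stored digest with a freshly computed one detects silent corruption of the local copy.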
<br />
The dCache system does NOT yet provide protection against user errors such as inadvertent file deletion. The [[Swestore-irods]] system does provide this protection: deleted files are moved to a trash can.<br />
<br />
== Getting access ==<br />
<br />
SweStore currently has a shortage of resources, and new projects will not be allocated until spring 2015. Existing SweStore projects that need more storage capacity are strongly encouraged to review their present data to see whether any of it can be compressed or deleted. If you have any questions, please contact the SNIC office (office@snic.se).<br />
<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
: iRODS is in <span style="color:#FF0000"> pilot phase </span><br><br />
: The iRODS system does not yet have the uptime and performance of our production system.<br />
: We are still working to bring iRODS into production.<br />
<br />
;Difference between dCache and iRODS user authentication<br />
:SweStore's dCache system uses eScience client certificates.<br />
:SweStore's iRODS system uses [http://www.yubico.com/products/yubikey-hardware/yubikey/ Yubikey] one-time passwords (OTP). With a simple touch of a button, a 44-character one-time password is generated and sent to the system. The user will be provided with a SweStore Yubikey.<br />
:The Yubikey solution currently has pilot status and may change in the future.<br />
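For illustration only (this is not how SweStore validates OTPs server-side): a Yubikey OTP is 44 characters drawn from the "modhex" alphabet, so its shape can be sanity-checked with a regular expression:

```python
import re

# Format check only: a Yubikey OTP is 44 characters from the modhex
# alphabet "cbdefghijklnrtuv". Real validation happens on the server.
MODHEX_OTP = re.compile(r"^[cbdefghijklnrtuv]{44}$")

def looks_like_yubikey_otp(otp: str) -> bool:
    return bool(MODHEX_OTP.match(otp))

print(looks_like_yubikey_otp("c" * 44))  # True (right length and alphabet)
print(looks_like_yubikey_otp("abc123"))  # False
```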
<br />
; dCache usage - How to acquire an eScience client certificate<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
:; Request membership in the SweGrid VO<br />
:: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
; iRODS usage - How to acquire a SweStore yubikey<br />
:Please send an email to [mailto:support@swestore.se?subject=Yubikey support@swestore.se] and provide the shipping address to where the yubikey should be sent.<br><br />
:The Yubikey solution currently has pilot status and may change in the future.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
== dCache ==<br />
<br />
=== Access protocols ===<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
<br />
For authentication, eScience certificates are used, which provide a higher level of security than legacy username/password schemes.<br />
<br />
=== Download and upload data ===<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways: as a simple and reliable directory index at https://webdav.swestore.se/ and as a richer interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager has many features and functions that SweStore does not support; only the basic file transfer features are supported.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
: Several tools are capable of using the protocols provided by the SweStore national storage.<br />
: For interactive usage on SNIC clusters we recommend the ARC tools, which should be installed on all SNIC resources.<br />
: As an integration point for building scripts and automated systems we suggest the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
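As a sketch of how a scripted upload with curl over WebDAV might be assembled (the certificate paths are hypothetical placeholders and `YOUR_PROJECT_NAME` stands in for a real project directory; the command is printed for review rather than executed):

```python
import os

# Hypothetical sketch: build a curl upload command for the SweStore
# WebDAV endpoint using an eScience client certificate. Paths and the
# project name are placeholders, not real defaults.
cert = os.path.expanduser("~/.globus/usercert.pem")
key = os.path.expanduser("~/.globus/userkey.pem")
project = "YOUR_PROJECT_NAME"
filename = "results.tar.gz"
url = f"https://webdav.swestore.se/snic/{project}/{filename}"

cmd = ["curl", "--cert", cert, "--key", key, "--upload-file", filename, url]
print(" ".join(cmd))  # review the command before running it
```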
<br />
=== Tools and scripts ===<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
=== Slides and more ===<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
=== Usage monitoring ===<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== iRODS ==<br />
<br />
Documentation of the SNIC iRODS system: [[Swestore-irods]].</div> Tom Langborg (NSC)
<hr />
<div>[[Category:SweStore]]<br />
[[Category:SweStore user guide]]<br />
[[SweStore|< SweStore]]<br />
<br />
The SweStore nationally accessible storage is available for researchers financed by VR (which includes all researchers using SNIC compute resources) and FORMA.<br><br />
<span style="color:#FF0000"> SNIC has closed for new application until further notice.<br><br />
</span><br><br />
<br />
= Research Communities =<br />
<br />
SweStore is also in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], [http://www.bioimaging.se/swedish_bioimaging_network/Welcome.html Bioimage], [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX],[http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet]. If any of these cover your research area, first read their information on applying for SweStore storage.<br />
<br />
== UPPNEX ==<br />
If you are a member of an UPPNEX project on UPPMAX, an iRODS service is already available and configured for you, please see http://www.uppmax.uu.se/faq/how-to-move-files-to-swestore-using-irods for more information on how to use it.<br />
<br />
== Other communities ==<br />
Unless you are instructed otherwise, submit an application to SweStore as outlined below.<br />
<br />
= Application instructions =<br />
<br />
Application for Swestore need more fields now. Soon Swestore application will be moving to SUPR System. Once it is moved, these fields can be filled in SUPR instead.<br />
<br />
Send an email to [mailto:support@swestore.se support@swestore.se] <br />
<br />
Please include the following information in the application:<br><br />
* Whether you want the regular/production SweStore storage (based on dCache) or the new iRODS based storage (currently in pre-production pilot phase).<br />
** If you apply for iRODS-storage: please provide a shipping address to where your yubikey should be sent.<br />
<br />
* Principal Investigator (PI) - (Account Responsible)<br />
** Name<br />
** Job title/position<br />
** E-mail address<br />
** Phone no<br />
** Citizenship<br />
** Department<br />
** Organization<br />
** Address<br />
** Postal Code<br />
** City<br />
<br />
* Technical Administrator (the Primary contact)<br />
** Name<br />
** Job title/position<br />
** E-mail address<br />
** Phone no<br />
** Citizenship<br />
** Department<br />
** Organization<br />
** Address<br />
** Postal Code<br />
** City<br />
<br />
* Project Classification code according to VR classification codes http://classification.vr.se<br />
* Associated SNIC compute resource allocation (if any)<br />
* Project Summary and Purpose for the storage (abstract): A short description of the project and type of data. (Max 250 words)<br />
* Funding agency<br />
* Describe how the data is produced (ex:- computer simulation, experimentation, observation, sensors, digitization of data)<br />
* Describe by whom the data is produced, especially in case this is not within project itself<br />
* Required storage capacity: Preferably a maximum size, but if this is not currently determinable, please calculate a starting size and expansion by time period. <br />
** '''NOTE''' that applications larger than 10TB takes longer to process as they go through SNIC approval.<br />
* Please indicate the expected total storage volume (in TeraBytes) per year and how this is spread across storage media<br />
<br />
{| class="wikitable"<br />
|-<br />
! Type<br />
! 2014<br />
! 2015<br />
! 2016<br />
! 2017<br />
! 2018<br />
|-<br />
| Online,Ready-to-use Data<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
|-<br />
| Backup,Longer-Latency Data<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
|}<br />
<br />
* Describe how frequently the data will be uploaded to Swestore (once, rarely, continuously)<br />
* Describe which communities should be able to access the data (ex:- a set of people, VO, part of VO, publicly available)<br />
* Describe what kind of access the users should have to the data (read only, read and write)<br />
* Suggested project title: This should be a short descriptive human readable name<br />
* Suggested directory name: This will be used as root directory name for your storage<br />
** '''NOTE''' that this name is long-lived and will persist. It is not coupled to the lifetime of SNIC compute time allocations.<br />
** We recommend a project name not tied to a person.<br />
** Additionally, we recommend that the name is not a common word or a term easily confusable with other current or future research efforts.<br />
** It is a good idea to select a name that's short and easy to type.<br />
** The name is limited to lower-case letters a-z, digits 0-9, hyphens - and underscores _.</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-dCache&diff=6009Swestore-dCache2014-12-15T14:29:55Z<p>Tom Langborg (NSC): /* Getting access */</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], [http://www.bioimaging.se/swedish_bioimaging_network/Welcome.html Bioimage Sweden], [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX],[http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ NaturHistoriska RiksMuseet].<br />
<br />
= National storage "SweStore"=<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems.<br />
<br />
Swestore is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc], [http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax]. Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple crash of a storage element to losing an entire site while still providing access to the stored data. <br />
<br />
One of the major advantages to the distributed nature of dCache and iRODS ([[Swestore-irods]]) is the excellent aggregated transfer rates possible. This is achieved by bypassing a central node and having transfers going directly to/from the storage elements if the protocol allows it. The Swestore Nationally Accessible Storage system can achieve aggregated transfer rates in excess of 100 Gigabit per second, but in practice this is limited by connectivity to each University (usually 10 Gbit/s) or a limited number of files (typically<br />
max 1 Gbit/s per file/connection).<br />
<br />
To protect against silent data corruption the dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The dCache system does NOT yet provide protection against user errors like inadvertent file deletions and so on. The [[Swestore-irods]] system provides this protection. Deleted files are moved to a trashcan.<br />
<br />
== Getting access ==<br />
<br />
<span style="color:#FF0000"> SNIC has closed for new application until further notice.<br><br />
</span><br><br />
<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
: iRODS is in <span style="color:#FF0000"> pilot phase </span><br><br />
: The iRODS system dosen't have the uptime and performance that our production system have.<br />
: We are still working with iRODS to get it into production.<br />
<br />
;Difference between dCache and iRODS user authentication<br />
:SweStore's dCache system uses eScience client certificates.<br />
:SweStore's iRODS system uses [http://www.yubico.com/products/yubikey-hardware/yubikey/ Yubikey] one-time passwords (OTP). With a simple touch of a button, a 44 character one-time password is generated and sent to the system. The user will be provided with a SweStore yubikey.<br />
:Yubikey has a status as pilot now. It can be changed in the future.<br />
<br />
; dCache usage - How to acquire an eScience client certificate<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
:; Request membership in the SweGrid VO<br />
:: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
; iRODS usage - How to acquire a SweStore yubikey<br />
:Please send an email to [mailto:support@swestore.se?subject=Yubikey support@swestore.se] and provide the shipping address to where the yubikey should be sent.<br><br />
:Yubikey has a status as pilot now. It can be changed in the future.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
== dCache ==<br />
<br />
=== Access protocols ===<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
<br />
For authentication eScience certificates are used, which provides a higher level of security than legacy username/password schemes.<br />
<br />
=== Download and upload data ===<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways, as a simple and reliable directory index interface at https://webdav.swestore.se/ and with a richer interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager has a lot of features and functions not supported in SweStore, only the basic file transfer features are supported.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
There are several tools that are capable of using the protocols provided by SweStore national storage.<br />
For interactive usage on SNIC clusters we recommend using the ARC tools which should be installed on all SNIC resources.<br />
As an integration point for building scripts and automated systems we suggest using the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
<br />
=== Tools and scripts ===<br />
<br />
There exists a number of tools and utilities developed externally that can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
=== Slides and more ===<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
=== Usage monitoring ===<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== iRODS ==<br />
<br />
Documentation of the SNIC iRODS system: [[Swestore-irods]].</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Apply_for_storage_on_Swestore&diff=6008Apply for storage on Swestore2014-12-15T09:41:40Z<p>Tom Langborg (NSC): </p>
<hr />
<div>[[Category:SweStore]]<br />
[[Category:SweStore user guide]]<br />
[[SweStore|< SweStore]]<br />
<br />
The SweStore nationally accessible storage is available for researchers financed by VR (which includes all researchers using SNIC compute resources) and FORMA.<br><br />
<span style="color:#FF0000"> SNIC has closed for new application until further notice.<br><br />
Due to no procurements has been approved under 2014.</span><br><br />
<br />
= Research Communities =<br />
<br />
SweStore is also in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], [http://www.bioimaging.se/swedish_bioimaging_network/Welcome.html Bioimage], [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX],[http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet]. If any of these cover your research area, first read their information on applying for SweStore storage.<br />
<br />
== UPPNEX ==<br />
If you are a member of an UPPNEX project on UPPMAX, an iRODS service is already available and configured for you, please see http://www.uppmax.uu.se/faq/how-to-move-files-to-swestore-using-irods for more information on how to use it.<br />
<br />
== Other communities ==<br />
Unless you are instructed otherwise, submit an application to SweStore as outlined below.<br />
<br />
= Application instructions =<br />
<br />
Application for Swestore need more fields now. Soon Swestore application will be moving to SUPR System. Once it is moved, these fields can be filled in SUPR instead.<br />
<br />
Send an email to [mailto:support@swestore.se support@swestore.se] <br />
<br />
Please include the following information in the application:<br><br />
* Whether you want the regular/production SweStore storage (based on dCache) or the new iRODS based storage (currently in pre-production pilot phase).<br />
** If you apply for iRODS-storage: please provide a shipping address to where your yubikey should be sent.<br />
<br />
* Principal Investigator (PI) - (Account Responsible)<br />
** Name<br />
** Job title/position<br />
** E-mail address<br />
** Phone no<br />
** Citizenship<br />
** Department<br />
** Organization<br />
** Address<br />
** Postal Code<br />
** City<br />
<br />
* Technical Administrator (the Primary contact)<br />
** Name<br />
** Job title/position<br />
** E-mail address<br />
** Phone no<br />
** Citizenship<br />
** Department<br />
** Organization<br />
** Address<br />
** Postal Code<br />
** City<br />
<br />
* Project Classification code according to VR classification codes http://classification.vr.se<br />
* Associated SNIC compute resource allocation (if any)<br />
* Project summary and purpose of the storage (abstract): a short description of the project and the type of data (max 250 words)<br />
* Funding agency<br />
* Describe how the data is produced (e.g. computer simulation, experiments, observations, sensors, digitization of data)<br />
* Describe by whom the data is produced, especially if it is not produced within the project itself<br />
* Required storage capacity: preferably a maximum size, but if this cannot yet be determined, please estimate a starting size and the expected growth per time period. <br />
** '''NOTE''' that applications larger than 10 TB take longer to process as they go through SNIC approval.<br />
* Please indicate the expected total storage volume (in terabytes) per year and how this is spread across storage media<br />
<br />
{| class="wikitable"<br />
|-<br />
! Type<br />
! 2014<br />
! 2015<br />
! 2016<br />
! 2017<br />
! 2018<br />
|-<br />
| Online, ready-to-use data<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
|-<br />
| Backup, longer-latency data<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
| TBytes<br />
|}<br />
<br />
* Describe how frequently the data will be uploaded to Swestore (once, rarely, continuously)<br />
* Describe which communities should be able to access the data (e.g. a set of people, a VO, part of a VO, publicly available)<br />
* Describe what kind of access the users should have to the data (read only, read and write)<br />
* Suggested project title: this should be a short, descriptive, human-readable name<br />
* Suggested directory name: this will be used as the root directory name for your storage<br />
** '''NOTE''' that this name is long-lived and will persist. It is not coupled to the lifetime of SNIC compute time allocations.<br />
** We recommend a project name not tied to a person.<br />
** Additionally, we recommend that the name is not a common word or a term easily confusable with other current or future research efforts.<br />
** It is a good idea to select a name that's short and easy to type.<br />
** The name is limited to lower-case letters a-z, digits 0-9, hyphens - and underscores _.</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-dCache&diff=6007Swestore-dCache2014-12-15T09:40:00Z<p>Tom Langborg (NSC): /* Getting access */</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], [http://www.bioimaging.se/swedish_bioimaging_network/Welcome.html Bioimage Sweden], [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX],[http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ NaturHistoriska RiksMuseet].<br />
<br />
= National storage "SweStore"=<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems.<br />
<br />
Swestore is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc], [http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax]. Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple crash of a storage element to losing an entire site while still providing access to the stored data. <br />
<br />
One of the major advantages of the distributed nature of dCache and iRODS ([[Swestore-irods]]) is the excellent aggregated transfer rates that are possible. This is achieved by bypassing a central node and having transfers go directly to/from the storage elements if the protocol allows it. The Swestore Nationally Accessible Storage system can achieve aggregated transfer rates in excess of 100 gigabit per second, but in practice this is limited by the connectivity of each university (usually 10 Gbit/s) or, when transferring only a few files, by the per-connection rate (typically<br />
max 1 Gbit/s per file/connection).<br />
<br />
To protect against silent data corruption the dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The dCache system does NOT yet provide protection against user errors like inadvertent file deletions and so on. The [[Swestore-irods]] system provides this protection. Deleted files are moved to a trashcan.<br />
<br />
== Getting access ==<br />
<br />
<span style="color:#FF0000"> SNIC is closed for new applications until further notice,<br><br />
as no procurements were approved during 2014.</span><br><br />
<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
: iRODS is in <span style="color:#FF0000"> pilot phase </span><br><br />
: The iRODS system does not yet have the uptime and performance of our production system.<br />
: We are still working to bring iRODS into production.<br />
<br />
;Difference between dCache and iRODS user authentication<br />
:SweStore's dCache system uses eScience client certificates.<br />
:SweStore's iRODS system uses [http://www.yubico.com/products/yubikey-hardware/yubikey/ Yubikey] one-time passwords (OTP). With a simple touch of a button, a 44 character one-time password is generated and sent to the system. The user will be provided with a SweStore yubikey.<br />
:Yubikey authentication currently has pilot status; this may change in the future.<br />
<br />
; dCache usage - How to acquire an eScience client certificate<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
:; Request membership in the SweGrid VO<br />
:: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
; iRODS usage - How to acquire a SweStore yubikey<br />
:Please send an email to [mailto:support@swestore.se?subject=Yubikey support@swestore.se] and provide the shipping address to where the yubikey should be sent.<br><br />
:Yubikey authentication currently has pilot status; this may change in the future.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
== dCache ==<br />
<br />
=== Access protocols ===<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
<br />
For authentication, eScience certificates are used, which provide a higher level of security than legacy username/password schemes.<br />
<br />
=== Download and upload data ===<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways: as a simple and reliable directory index interface at https://webdav.swestore.se/ and as a richer interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager has many features and functions that are not supported in SweStore; only the basic file transfer features are supported.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
There are several tools that are capable of using the protocols provided by SweStore national storage.<br />
For interactive usage on SNIC clusters we recommend using the ARC tools which should be installed on all SNIC resources.<br />
As an integration point for building scripts and automated systems we suggest using the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
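As a minimal illustration of the cURL route mentioned above, the sketch below only builds the project WebDAV URL and shows the upload command in a comment. The certificate paths, project name and file name are hypothetical placeholders, not values from this page.

```shell
# Sketch: build the project WebDAV URL and show a curl upload command.
# CERT, KEY, PROJ and result.dat are hypothetical placeholders.
CERT="$HOME/.globus/usercert.pem"
KEY="$HOME/.globus/userkey.pem"
PROJ="my_project"                    # replace with your allocated project name
URL="https://webdav.swestore.se/snic/$PROJ/"
echo "Target: $URL"
# Upload with HTTP PUT once the certificate and project are in place:
# curl --cert "$CERT" --key "$KEY" -T result.dat "${URL}result.dat"
```

The same URL also works read-only in a browser with the certificate installed, as described above.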
<br />
=== Tools and scripts ===<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
=== Slides and more ===<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
=== Usage monitoring ===<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== iRODS ==<br />
<br />
Documentation of the SNIC iRODS system: [[Swestore-irods]].</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-irods&diff=5943Swestore-irods2014-10-16T08:09:57Z<p>Tom Langborg (NSC): /* Support */</p>
<hr />
<div>= National Storage using iRODS =<br />
[[Category:Storage]]<br />
[[Category:SweStore]]<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
<br />
;iRODS user authentication<br />
:SweStore's iRODS system uses [http://www.yubico.com/products/yubikey-hardware/yubikey/ Yubikey] one-time passwords (OTP). With a simple touch of a button, a 44 character one-time password is generated and sent to the system. The user will be provided with a SweStore yubikey.<br />
:Yubikey authentication currently has pilot status; this may change in the future.<br />
<br />
; iRODS usage (<span style="color:#FF0000">pilot</span>) - How to acquire a SweStore yubikey<br />
:Please send an email to [mailto:support@swestore.se?subject=Yubikey support@swestore.se] and provide the shipping address to where the yubikey should be sent.<br><br />
:Yubikey authentication currently has pilot status; this may change in the future.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
<span style="color:#FF0000"> It is better not to use filenames containing single quotes. (There were problems with these, but they have been fixed.)</span><br><br />
<br />
== Usage monitoring ==<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== Supported clients ==<br />
<br />
: iDrop web - Point your Web browser to [https://iweb.swestore.se iweb.swestore.se]<br />
: E-iRODS iCommands 3.0 - Command line client [ftp://ftp.renci.org/pub/irods/releases/3.0.1 Download E-iRODS icommands]<br />
<br />
SweStore iRODS uses PAM authentication and SweStore yubikeys. With a simple touch of a button, a 44 character one-time password is generated and sent to the system.<br />
<br />
<br />
=== Web GUI (iDrop web) ===<br />
Please see the specific documentation for [[iDrop web]].<br />
<br />
=== Community iRODS version 3.3 ===<br />
The community iRODS client version 3.3 should also work, with PAM authentication.<br><br />
It is available from [http://irods.sdsc.edu/download.html SDSC].<br />
Please install the OpenSSL include files and libraries:<br />
<pre><br />
$ sudo apt-get install libssl-dev (Debian-based systems)<br />
# yum install openssl-devel (Red Hat-based systems)<br />
</pre><br />
Download irods 3.3 from http://irods.sdsc.edu/download.html and unpack the tar.gz archive.<br />
<br />
Please enable the following defines in the build configuration file iRODS/config/config.mk.in:<br />
<pre><br />
PAM_AUTH = 1<br />
PAM_AUTH_NO_EXTEND = 1<br />
USE_SSL = 1 <br />
</pre><br />
Please run irodssetup to compile the irods community client with PAM authentication.<br />
<br />
== SweStore iRODS usage documentation ==<br />
<br />
To use the system you need the E-iRODS command line client installed, or you can use iDrop web.<br />
<br />
=== Command line client ===<br />
<br />
For Linux, the iRODS command line client is available as an installable package for various<br />
distributions from the E-iRODS website downloads section.<br />
<br />
The command line client feels natural to Unix users.<br />
There are versions of the usual ls, rm, mv, mkdir, pwd and rsync<br />
commands prefixed with an i for iRODS, e.g. irm, imv and imkdir.<br />
<br />
As expected, iput and iget move files to and from the iRODS system.<br />
All these commands print a short help text when invoked with the -h option.<br />
<br />
==== iCommands environment file ====<br />
<br />
There is an environment file, .irodsEnv, in the .irods subdirectory<br />
of your home directory ($HOME/.irods/.irodsEnv), which specifies where and how<br />
to access the iRODS metadata (iCAT) server.<br />
<br />
It looks like (placeholders are in <>):<br />
<pre><br />
irodsHost 'irods.swestore.se'<br />
irodsPort 1247<br />
irodsDefResource 'snicdefResc'<br />
irodsHome '/snicZone/proj/<PROJECT_NAME>'<br />
irodsCwd '/snicZone/proj/<PROJECT_NAME>'<br />
irodsUserName '<USERNAME>'<br />
irodsZone 'snicZone'<br />
irodsAuthScheme 'PAM'<br />
</pre><br />
<br />
The iCAT server is irods.swestore.se.<br />
The default irods zone name is snicZone.<br />
The default resource is snicdefResc.<br />
It is best to set the home directory to the same as the<br />
project directory, which would be a subdirectory under<br />
the /snicZone/proj directory tree.<br />
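The environment file above can be generated from the shell. In this sketch, myproject and myuser are hypothetical placeholders that must be replaced with your actual project name and username.

```shell
# Create $HOME/.irods/.irodsEnv with the settings described above.
# myproject and myuser are hypothetical placeholders.
mkdir -p "$HOME/.irods"
cat > "$HOME/.irods/.irodsEnv" <<'EOF'
irodsHost 'irods.swestore.se'
irodsPort 1247
irodsDefResource 'snicdefResc'
irodsHome '/snicZone/proj/myproject'
irodsCwd '/snicZone/proj/myproject'
irodsUserName 'myuser'
irodsZone 'snicZone'
irodsAuthScheme 'PAM'
EOF
```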
<br />
==== Yubikey instructions ====<br />
<br />
Prerequisite: A correct iCommands environment file, see above for instructions.<br />
<br />
# Insert the yubikey in an available USB-slot in your computer.<br />
# Type iinit<br />
# Touch the conductive surface on the yubikey to send an one-time password to the system. <br />
<br />
<pre><br />
<br />
$ iinit<br />
Enter your current PAM (system) password:<br />
$ ils<br />
/snicZone/proj/<projectname>:<br />
$<br />
</pre><br />
<br />
After that we can use the usual iCommands for 8 hours.<br />
<br />
More details on the iCommands are available at<br />
https://www.irods.org/index.php/icommands<br />
<br />
==== iCommands ====<br />
<br />
Having initialized the session as described above, we can use the iRODS versions<br />
of the basic Unix commands. The project directory is under /snicZone/proj; all<br />
members of the project should have write access to this directory. We can use<br />
the command<br />
<pre><br />
icd /snicZone/proj/projectname<br />
</pre><br />
to move to the project directory, or to change to another project directory<br />
when we are members of more than one project.<br />
<br />
All commands give short help when invoked with the -h flag.<br />
<br />
To put files into the iRODS system we can use:<br />
<pre><br />
iput localfile irodsfile<br />
</pre><br />
or, to put a whole directory tree:<br />
<pre><br />
iput -r localdirectory irodscollection<br />
</pre><br />
<br />
To upload large amounts of data it may be more efficient to use<br />
<pre><br />
irsync -r localdirectory i:irodscollection<br />
</pre><br />
It is a good idea to use -K so that checksums are computed,<br />
stored and verified.<br />
<br />
To create directories (collections in iRODS-speak) we use:<br />
<pre><br />
imkdir collection<br />
</pre><br />
as expected.<br />
<br />
To get those files back we can use<br />
<pre><br />
iget irodsfile localfile<br />
</pre><br />
or<br />
<pre><br />
irsync -r i:irodscollection localdirectory<br />
</pre><br />
<br />
To remove files we use:<br />
<pre><br />
irm<br />
</pre><br />
or<br />
<pre><br />
irm -r<br />
</pre><br />
<br />
Removing files like this moves them into the trashcan (path: /snicZone/trash/).<br />
From time to time you will need to empty the trashcan, using<br />
<pre><br />
irmtrash<br />
</pre><br />
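Tying the commands above together, here is a dry-run sketch of a typical session. The project and file names are hypothetical; setting DRYRUN to empty would run the commands for real against the iRODS server.

```shell
# Dry-run sketch of a typical iCommands session (hypothetical names).
# With DRYRUN=echo each command is only printed, not executed.
DRYRUN=echo
$DRYRUN iinit                            # authenticate with the yubikey OTP
$DRYRUN icd /snicZone/proj/myproject     # enter the project collection
$DRYRUN imkdir run01                     # create a collection
$DRYRUN iput -K results.tar.gz run01/    # upload with checksum verification
$DRYRUN irm -r run01                     # remove (moves to the trashcan)
$DRYRUN irmtrash                         # empty the trashcan
```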
<br />
==== Using iCommands on SNIC HPC clusters ====<br />
<br />
On SNIC clusters the iCommands command line tools are either available in the PATH or after loading the irods module, e.g.<br />
: module load irods<br />
:If the irods commands are not available on the SNIC HPC cluster, please contact support@swestore.se<br />
You also need to set up the iCommands environment file $HOME/.irods/.irodsEnv.<br />
<br />
=== Storage Project directory structure ===<br />
<br />
Your storage project is available at /snicZone/proj/<PROJECT NAME><br />
<br />
/snicZone/home/<USERNAME> is just a small home directory.<br />
<br />
=== iDROP web client ===<br />
<br />
See the [[iDrop web]] specific page.<br />
<br />
=== Upstream documentation ===<br />
Detailed documentation, papers and resources are available from<br />
the [http://www.eirods.org E-iRODS web site]<br />
<br />
[http://www.irods.org Community iRODS]<br />
<br />
[https://groups.google.com/d/forum/irod-chat User forum]</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-dCache&diff=5881Swestore-dCache2014-06-18T06:32:44Z<p>Tom Langborg (NSC): /* Getting access */</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], [http://www.bioimaging.se/swedish_bioimaging_network/Welcome.html Bioimage Sweden], [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX],[http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ NaturHistoriska RiksMuseet].<br />
<br />
= National storage "SweStore"=<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems.<br />
<br />
Swestore is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc], [http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax]. Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple crash of a storage element to losing an entire site while still providing access to the stored data. <br />
<br />
One of the major advantages of the distributed nature of dCache and iRODS ([[Swestore-irods]]) is the excellent aggregated transfer rates that are possible. This is achieved by bypassing a central node and having transfers go directly to/from the storage elements if the protocol allows it. The Swestore Nationally Accessible Storage system can achieve aggregated transfer rates in excess of 100 gigabit per second, but in practice this is limited by the connectivity of each university (usually 10 Gbit/s) or, when transferring only a few files, by the per-connection rate (typically<br />
max 1 Gbit/s per file/connection).<br />
<br />
To protect against silent data corruption the dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The dCache system does NOT yet provide protection against user errors like inadvertent file deletions and so on. The [[Swestore-irods]] system provides this protection. Deleted files are moved to a trashcan.<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
: iRODS is in <span style="color:#FF0000"> pilot phase </span><br><br />
: The iRODS system does not yet have the uptime and performance of our production system.<br />
: We are still working to bring iRODS into production.<br />
<br />
;Difference between dCache and iRODS user authentication<br />
:SweStore's dCache system uses eScience client certificates.<br />
:SweStore's iRODS system uses [http://www.yubico.com/products/yubikey-hardware/yubikey/ Yubikey] one-time passwords (OTP). With a simple touch of a button, a 44 character one-time password is generated and sent to the system. The user will be provided with a SweStore yubikey.<br />
:Yubikey authentication currently has pilot status; this may change in the future.<br />
<br />
; dCache usage - How to acquire an eScience client certificate<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
:; Request membership in the SweGrid VO<br />
:: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
; iRODS usage - How to acquire a SweStore yubikey<br />
:Please send an email to [mailto:support@swestore.se?subject=Yubikey support@swestore.se] and provide the shipping address to where the yubikey should be sent.<br><br />
:Yubikey authentication currently has pilot status; this may change in the future.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
== dCache ==<br />
<br />
=== Access protocols ===<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
<br />
For authentication, eScience certificates are used, which provide a higher level of security than legacy username/password schemes.<br />
<br />
=== Download and upload data ===<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways: as a simple and reliable directory index interface at https://webdav.swestore.se/ and as a richer interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager has many features and functions that are not supported in SweStore; only the basic file transfer features are supported.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
There are several tools that are capable of using the protocols provided by SweStore national storage.<br />
For interactive usage on SNIC clusters we recommend using the ARC tools which should be installed on all SNIC resources.<br />
As an integration point for building scripts and automated systems we suggest using the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
<br />
=== Tools and scripts ===<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
=== Slides and more ===<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
=== Usage monitoring ===<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== iRODS ==<br />
<br />
Documentation of the SNIC iRODS system: [[Swestore-irods]].</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-dCache&diff=5880Swestore-dCache2014-06-18T06:30:29Z<p>Tom Langborg (NSC): /* Getting access */</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], [http://www.bioimaging.se/swedish_bioimaging_network/Welcome.html Bioimage Sweden], [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX],[http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ NaturHistoriska RiksMuseet].<br />
<br />
= National storage "SweStore"=<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems.<br />
<br />
Swestore is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc], [http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax]. Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple crash of a storage element to losing an entire site while still providing access to the stored data. <br />
<br />
One of the major advantages of the distributed nature of dCache and iRODS ([[Swestore-irods]]) is the excellent aggregated transfer rates that are possible. This is achieved by bypassing a central node and having transfers go directly to/from the storage elements if the protocol allows it. The Swestore Nationally Accessible Storage system can achieve aggregated transfer rates in excess of 100 gigabit per second, but in practice this is limited by the connectivity of each university (usually 10 Gbit/s) or, when transferring only a few files, by the per-connection rate (typically<br />
max 1 Gbit/s per file/connection).<br />
<br />
To protect against silent data corruption the dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The dCache system does NOT yet provide protection against user errors like inadvertent file deletions and so on. The [[Swestore-irods]] system provides this protection. Deleted files are moved to a trashcan.<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
: iRODS is in <span style="color:#FF0000"> pilot phase </span><br><br />
: The iRODS system does not yet have the uptime and performance of our production system.<br />
<br />
;Difference between dCache and iRODS user authentication<br />
:SweStore's dCache system uses eScience client certificates.<br />
:SweStore's iRODS system uses [http://www.yubico.com/products/yubikey-hardware/yubikey/ Yubikey] one-time passwords (OTP). With a simple touch of a button, a 44-character one-time password is generated and sent to the system. The user will be provided with a SweStore yubikey.<br />
:The Yubikey scheme currently has pilot status and may change in the future.<br />
<br />
; dCache usage - How to acquire an eScience client certificate<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
:; Request membership in the SweGrid VO<br />
:: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
; iRODS usage - How to acquire a SweStore yubikey<br />
:Please send an email to [mailto:support@swestore.se?subject=Yubikey support@swestore.se] and provide the shipping address to where the yubikey should be sent.<br><br />
:The Yubikey scheme currently has pilot status and may change in the future.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
== dCache ==<br />
<br />
=== Access protocols ===<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
<br />
For authentication, eScience certificates are used, which provide a higher level of security than legacy username/password schemes.<br />
<br />
=== Download and upload data ===<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways: as a simple and reliable directory index interface at https://webdav.swestore.se/ and as a richer interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager offers many features and functions that SweStore does not support; only the basic file transfer features work.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
: There are several tools that are capable of using the protocols provided by SweStore national storage.<br />
: For interactive usage on SNIC clusters we recommend the ARC tools, which should be installed on all SNIC resources.<br />
: As an integration point for building scripts and automated systems we suggest the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
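As an illustration of the curl-based approach, here is a minimal sketch. The certificate paths, the project name, and the <code>swestore_url</code> helper are placeholders for illustration only, not part of any SweStore tooling, and your exported eScience certificate may live elsewhere:

```shell
# Compose the WebDAV URL for a file in a Swestore project.
# (Hypothetical helper, for illustration only.)
swestore_url() {
    local project="$1" path="$2"
    printf 'https://webdav.swestore.se/snic/%s/%s\n' "$project" "$path"
}

swestore_url YOUR_PROJECT_NAME data.tar.gz

# With an exported client certificate, an upload and a download could
# look like this (paths are placeholders; a valid eScience certificate
# is required, so these lines are left commented out):
#   curl --cert ~/.globus/usercert.pem --key ~/.globus/userkey.pem \
#        --upload-file data.tar.gz \
#        "$(swestore_url YOUR_PROJECT_NAME data.tar.gz)"
#   curl --cert ~/.globus/usercert.pem --key ~/.globus/userkey.pem \
#        --output data.tar.gz \
#        "$(swestore_url YOUR_PROJECT_NAME data.tar.gz)"
```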
<br />
=== Tools and scripts ===<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
=== Slides and more ===<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
=== Usage monitoring ===<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== iRODS ==<br />
<br />
Documentation of the SNIC iRODS system: [[Swestore-irods]].</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-dCache&diff=5879Swestore-dCache2014-06-18T06:27:26Z<p>Tom Langborg (NSC): /* Getting access */</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], [http://www.bioimaging.se/swedish_bioimaging_network/Welcome.html Bioimage Sweden], [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX],[http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ NaturHistoriska RiksMuseet].<br />
<br />
== iRODS ==<br />
<br />
Documentation of the SNIC iRODS system: [[Swestore-irods]].</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-irods&diff=5878Swestore-irods2014-06-18T06:23:47Z<p>Tom Langborg (NSC): /* Getting access */</p>
<hr />
<div>= National Storage using iRODS =<br />
[[Category:Storage]]<br />
[[Category:SweStore]]<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
<br />
;iRODS user authentication<br />
:SweStore's iRODS system uses [http://www.yubico.com/products/yubikey-hardware/yubikey/ Yubikey] one-time passwords (OTP). With a simple touch of a button, a 44-character one-time password is generated and sent to the system. The user will be provided with a SweStore yubikey.<br />
:The Yubikey scheme currently has pilot status and may change in the future.<br />
<br />
; iRODS usage (<span style="color:#FF0000">pilot</span>) - How to acquire a SweStore yubikey<br />
:Please send an email to [mailto:support@swestore.se?subject=Yubikey support@swestore.se] and provide the shipping address to where the yubikey should be sent.<br><br />
:The Yubikey scheme currently has pilot status and may change in the future.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
<span style="color:#FF0000"> Be careful with filenames: do not use filenames containing the ' (single quote) character. </span><br><br />
<span style="color:#FF0000"> The system currently has memory issues when transferring many files or very large files. Be careful when moving files larger than 50 GB or more than 20,000 files at a time. </span><br />
<br />
== Usage monitoring ==<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== Supported clients ==<br />
<br />
: iDrop web - Point your Web browser to [https://iweb.swestore.se iweb.swestore.se]<br />
: E-iRODS iCommands 3.0 - Command line client [ftp://ftp.renci.org/pub/eirods/releases/3.0 Download E-iRODS icommands]<br />
<br />
SweStore iRODS uses PAM authentication and SweStore yubikeys. With a simple touch of a button, a 44 character one-time password is generated and sent to the system.<br />
<br />
<br />
=== Web GUI (iDrop web) ===<br />
Please see the specific documentation for [[iDrop web]].<br />
<br />
=== Community iRODS version 3.3 ===<br />
The community iRODS client version 3.3 should also work, using PAM authentication.<br><br />
Please install the OpenSSL include files and libraries:<br />
<pre><br />
$ sudo apt-get install libssl-dev    (Debian-based systems)<br />
# yum install openssl-devel          (Red Hat-based systems)<br />
</pre><br />
Download iRODS 3.3 from https://www.irods.org/index.php/Downloads and unpack the tar.gz archive.<br />
<br />
Enable the following defines in the Makefile configuration file iRODS/config/config.mk.in:<br />
<pre><br />
PAM_AUTH = 1<br />
PAM_AUTH_NO_EXTEND = 1<br />
USE_SSL = 1 <br />
</pre><br />
Then run irodssetup to compile the iRODS community client with PAM authentication.<br />
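The three defines above can be enabled by hand in an editor, or with a small sed loop. The snippet below demonstrates the idea on a stand-in copy of the file, since the exact commented-out form inside the real config.mk.in may differ from what is assumed here:

```shell
# Demonstrate uncommenting the build flags on a stand-in config file;
# against a real iRODS source tree you would run the loop on
# iRODS/config/config.mk.in instead (after checking the exact syntax).
cat > config.mk.in <<'EOF'
# PAM_AUTH = 1
# PAM_AUTH_NO_EXTEND = 1
# USE_SSL = 1
EOF

for flag in PAM_AUTH PAM_AUTH_NO_EXTEND USE_SSL; do
    sed -i "s/^# *\($flag *= *1\)/\1/" config.mk.in
done

cat config.mk.in
```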
<br />
== SweStore iRODS usage documentation ==<br />
<br />
To use the system you need the E-iRODS command line client installed, or you can use iDrop web. <br />
<br />
=== Command line client ===<br />
<br />
For Linux systems the iRODS command line client is available as an installable package for various<br />
Linux platforms from the e-iRODS website downloads section.<br />
<br />
The command line client feels natural to Unix users.<br />
There are versions of the usual ls, rm, mv, mkdir, pwd and rsync<br />
commands prefixed with an i for iRODS, e.g. ils, irm, imv, imkdir etc.<br />
<br />
As expected, iput and iget move files to and from the iRODS system.<br />
All these commands print short help when using the -h option.<br />
<br />
==== iCommands environment file ====<br />
<br />
There is an environment file, .irodsEnv, in the .irods subdirectory<br />
of your home directory ($HOME/.irods/.irodsEnv) which contains information about where and how<br />
to access the iRODS metadata (iCAT) server.<br />
<br />
It looks like (placeholders are in <>):<br />
<pre><br />
irodsHost 'irods.swestore.se'<br />
irodsPort 1247<br />
irodsDefResource 'snicdefResc'<br />
irodsHome '/snicZone/proj/<PROJECT_NAME>'<br />
irodsCwd '/snicZone/proj/<PROJECT_NAME>'<br />
irodsUserName '<USERNAME>'<br />
irodsZone 'snicZone'<br />
irodsAuthScheme 'PAM'<br />
</pre><br />
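The file can be created by hand, or with a here-document along the lines of the sketch below. The placeholder values in angle brackets are kept as-is and must be replaced with your own project name and username:

```shell
# Write a template iCommands environment file. The <PROJECT_NAME> and
# <USERNAME> placeholders must be replaced with your real values.
mkdir -p "$HOME/.irods"
cat > "$HOME/.irods/.irodsEnv" <<'EOF'
irodsHost 'irods.swestore.se'
irodsPort 1247
irodsDefResource 'snicdefResc'
irodsHome '/snicZone/proj/<PROJECT_NAME>'
irodsCwd '/snicZone/proj/<PROJECT_NAME>'
irodsUserName '<USERNAME>'
irodsZone 'snicZone'
irodsAuthScheme 'PAM'
EOF
echo "Wrote $HOME/.irods/.irodsEnv"
```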
<br />
The iCAT server is irods.swestore.se.<br />
The default irods zone name is snicZone.<br />
The default resource is snicdefResc.<br />
It is best to set the home directory to the same as the<br />
project directory, which would be a subdirectory under<br />
the /snicZone/proj directory tree.<br />
<br />
==== Yubikey instructions ====<br />
<br />
Prerequisite: A correct iCommands environment file, see above for instructions.<br />
<br />
# Insert the yubikey in an available USB-slot in your computer.<br />
# Run iinit.<br />
# Touch the conductive surface on the yubikey to send a one-time password to the system. <br />
<br />
<pre><br />
<br />
$ iinit<br />
Enter your current PAM (system) password:<br />
$ ils<br />
/snicZone/proj/<projectname>:<br />
$<br />
</pre><br />
<br />
After that, the usual iCommands can be used for 8 hours.<br />
<br />
More details on the iCommands are available at<br />
https://www.irods.org/index.php/icommands<br />
<br />
==== iCommands ====<br />
<br />
Having initialized the session as described above, we can use the iRODS versions<br />
of the basic Unix commands. The project directory is under /snicZone/proj; all<br />
members of the project should have write access to this directory. We can use<br />
the command<br />
<pre><br />
icd /snicZone/proj/projectname<br />
</pre><br />
to move to the project directory, or to change to another project directory<br />
when we are members of more than one project.<br />
<br />
All commands give short help when invoked with the -h flag.<br />
<br />
To put files into the iRODS system we can use:<br />
<pre><br />
iput localfile irodsfile<br />
</pre><br />
or, to put a whole directory tree:<br />
<pre><br />
iput -r localdirectory irodscollection<br />
</pre><br />
<br />
To load large amounts of data it may be more efficient to use<br />
<pre><br />
irsync -r localdirectory irodscollection<br />
</pre><br />
It is a good idea to use the -K option so that checksums are computed,<br />
stored and verified.<br />
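Put together, mirroring a results directory into a collection might look like the sketch below. The wrapper only prints the command so it can be reviewed before running, the project path is a placeholder, and the i: prefix marks the iRODS side of an irsync transfer:

```shell
# Print (and optionally run) an irsync command that mirrors a local
# directory into an iRODS collection with checksum verification (-K).
# The "i:" prefix tells irsync that the destination is on the iRODS side.
mirror_to_irods() {
    local src="$1" dest="$2"
    echo "irsync -r -K $src i:$dest"
    # Uncomment to actually run it (requires an active iinit session):
    # irsync -r -K "$src" "i:$dest"
}

mirror_to_irods ./results /snicZone/proj/myproject/results
```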
<br />
To create directories (called collections in iRODS) we use:<br />
<pre><br />
imkdir collection<br />
</pre><br />
as expected.<br />
<br />
To get those files back we can use<br />
<pre><br />
iget irodsfile localfile<br />
</pre><br />
or<br />
<pre><br />
irsync -r irodscollection localdirectory<br />
</pre><br />
<br />
To remove files we use:<br />
<pre><br />
irm<br />
</pre><br />
or<br />
<pre><br />
irm -r<br />
</pre><br />
<br />
Removing files like this moves them into the trashcan (path: /snicZone/trash/).<br />
From time to time you will need to empty the trashcan, using<br />
<pre><br />
irmtrash<br />
</pre><br />
<br />
==== Using iCommands on SNIC HPC clusters ====<br />
<br />
On SNIC clusters the iCommands command line tools are either available in the PATH or can be made available by loading the irods module, e.g.<br />
: module load irods<br />
: If the irods commands are not available on a SNIC HPC cluster, please contact support@swestore.se<br />
You also need to set up the iCommands environment file $HOME/.irods/.irodsEnv<br />
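A quick way to check whether the iCommands are already on your PATH before trying the module is a plain shell test like the one below (an illustrative sketch, not an official SNIC tool):

```shell
# Report whether the iCommands are available in the current shell.
check_icommands() {
    if command -v ils >/dev/null 2>&1; then
        echo "iCommands available: $(command -v ils)"
    else
        echo "iCommands not found; try 'module load irods' or contact support@swestore.se"
    fi
}

check_icommands
```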
<br />
=== Storage Project directory structure ===<br />
<br />
Your storage project is available at /snicZone/proj/<PROJECT NAME><br />
<br />
/snicZone/home/<USERNAME> is just a small home directory.<br />
<br />
=== iDROP web client ===<br />
<br />
See the [[iDrop web]] specific page.<br />
<br />
=== Upstream documentation ===<br />
Detailed documentation, papers and resources are available from<br />
the [http://www.eirods.org E-iRODS web site]<br />
<br />
[http://www.irods.org Community iRODS]<br />
<br />
[https://groups.google.com/d/forum/irod-chat User forum]</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-irods&diff=5722Swestore-irods2014-04-01T07:01:31Z<p>Tom Langborg (NSC): /* Support */</p>
<hr />
<div>= National Storage using iRODS =<br />
[[Category:Storage]]<br />
[[Category:SweStore]]<br />
<br />
[https://groups.google.com/d/forum/irod-chat User forum]</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-irods&diff=5721Swestore-irods2014-03-28T14:57:10Z<p>Tom Langborg (NSC): /* Support */</p>
<hr />
<div>= National Storage using iRODS =<br />
[[Category:Storage]]<br />
[[Category:SweStore]]<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
<br />
;iRODS user authentication<br />
:SweStore's iRODS system uses [http://www.yubico.com/products/yubikey-hardware/yubikey/ Yubikey] one-time passwords (OTP). With a simple touch of a button, a 44 character one-time password is generated and sent to the system. The user will be provided with a SweStore yubikey.<br />
:The Yubikey scheme currently has pilot status and may change in the future.<br />
<br />
; iRODS usage - How to acquire a SweStore yubikey<br />
:Please send an email to [mailto:support@swestore.se?subject=Yubikey support@swestore.se] and provide the shipping address to where the yubikey should be sent.<br><br />
:The Yubikey scheme currently has pilot status and may change in the future.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
<span style="color:#FF0000"> Be careful with filenames: avoid filenames containing the apostrophe character ('). </span><br />
<br />
== Usage monitoring ==<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== Supported clients ==<br />
<br />
: iDrop web - Point your Web browser to [https://iweb.swestore.se iweb.swestore.se]<br />
: E-iRODS iCommands 3.0 - Command line client [ftp://ftp.renci.org/pub/eirods/releases/3.0 Download E-iRODS icommands]<br />
<br />
SweStore iRODS uses PAM authentication and SweStore yubikeys. With a simple touch of a button, a 44-character one-time password is generated and sent to the system.<br />
<br />
<br />
=== Web GUI (iDrop web) ===<br />
Please see the specific documentation for [[iDrop web]].<br />
<br />
=== Community iRODS version 3.3 ===<br />
The community iRODS client version 3.3 should also work with PAM authentication.<br><br />
First install the OpenSSL header files and libraries:<br />
<pre><br />
$ sudo apt-get install libssl-dev     # Debian-based systems<br />
$ sudo yum install openssl-devel      # Red Hat-based systems<br />
</pre><br />
Download iRODS 3.3 from https://www.irods.org/index.php/Downloads and unpack the tar.gz archive.<br />
<br />
Enable the following defines in iRODS/config/config.mk.in:<br />
<pre><br />
PAM_AUTH = 1<br />
PAM_AUTH_NO_EXTEND = 1<br />
USE_SSL = 1 <br />
</pre><br />
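Those defines can also be enabled non-interactively. The following is only a sketch: it assumes the defines appear commented out in the file as e.g. <code># PAM_AUTH = 1</code>, and it demonstrates the edit on a stand-in copy of config.mk.in rather than a real iRODS source tree.<br />

```shell
# Create a stand-in config.mk.in with the defines commented out,
# as they are assumed to appear in the 3.3-era makefile include.
cat > config.mk.in <<'EOF'
# PAM_AUTH = 1
# PAM_AUTH_NO_EXTEND = 1
# USE_SSL = 1
EOF

# Uncomment the three defines in place (GNU sed -i).
sed -i -e 's/^# *\(PAM_AUTH_NO_EXTEND\|PAM_AUTH\|USE_SSL\) *=/\1 =/' config.mk.in

cat config.mk.in   # the three defines are now active
```

In a real build you would run the sed command against iRODS/config/config.mk.in in the unpacked source tree before running irodssetup.<br />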
Then run irodssetup to compile the community iRODS client with PAM authentication.<br />
<br />
== SweStore iRODS usage documentation ==<br />
<br />
To use the system you need the E-iRODS command line client installed, or you can use the iDROP web interface.<br />
<br />
=== Command line client ===<br />
<br />
For Linux systems the iRODS command line client is available as an installable package for various<br />
Linux platforms from the e-iRODS website downloads section.<br />
<br />
The command line client feels natural to Unix users.<br />
There are versions of the usual ls, rm, mv, mkdir, pwd and rsync<br />
commands prefixed with an i for iRODS, e.g. ils, irm, imv, imkdir.<br />
<br />
As expected, iput and iget move files to and from the iRODS system.<br />
All these commands print a short help text when given the -h option.<br />
<br />
==== iCommands environment file ====<br />
<br />
There is an environment file, .irodsEnv, in the .irods subdirectory<br />
of the home directory ($HOME/.irods/.irodsEnv), which tells the client where and how<br />
to reach the iRODS metadata (iCAT) server.<br />
<br />
It looks like this (placeholders are shown in <>):<br />
<pre><br />
irodsHost 'irods.swestore.se'<br />
irodsPort 1247<br />
irodsDefResource 'snicdefResc'<br />
irodsHome '/snicZone/proj/<PROJECT_NAME>'<br />
irodsCwd '/snicZone/proj/<PROJECT_NAME>'<br />
irodsUserName '<USERNAME>'<br />
irodsZone 'snicZone'<br />
irodsAuthScheme 'PAM'<br />
</pre><br />
<br />
The iCAT server is irods.swestore.se.<br />
The default irods zone name is snicZone.<br />
The default resource is snicdefResc.<br />
It is best to set the home directory to the same as the<br />
project directory, which would be a subdirectory under<br />
the /snicZone/proj directory tree.<br />
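To bootstrap the file from the shell, something like the following sketch can be used. The project name snic-example-1 and the username jdoe are hypothetical placeholders; the host, zone and resource values are the ones given above.<br />

```shell
# Bootstrap $HOME/.irods/.irodsEnv (sketch; the project name
# "snic-example-1" and username "jdoe" are hypothetical placeholders --
# substitute your own before use).
mkdir -p "$HOME/.irods"
cat > "$HOME/.irods/.irodsEnv" <<'EOF'
irodsHost 'irods.swestore.se'
irodsPort 1247
irodsDefResource 'snicdefResc'
irodsHome '/snicZone/proj/snic-example-1'
irodsCwd '/snicZone/proj/snic-example-1'
irodsUserName 'jdoe'
irodsZone 'snicZone'
irodsAuthScheme 'PAM'
EOF
chmod 600 "$HOME/.irods/.irodsEnv"   # keep authentication config private
```

After creating the file, running iinit should prompt for the one-time password as described in the Yubikey instructions.<br />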
<br />
==== Yubikey instructions ====<br />
<br />
Prerequisite: A correct iCommands environment file, see above for instructions.<br />
<br />
# Insert the yubikey in an available USB slot on your computer.<br />
# Type iinit<br />
# Touch the conductive surface on the yubikey to send a one-time password to the system. <br />
<br />
<pre><br />
<br />
$ iinit<br />
Enter your current PAM (system) password:<br />
$ ils<br />
/snicZone/proj/<projectname>:<br />
$<br />
</pre><br />
<br />
After that we can use the usual iCommands for 8 hours.<br />
<br />
More details on the iCommands are available at<br />
https://www.irods.org/index.php/icommands<br />
<br />
==== iCommands ====<br />
<br />
Having initialized the session as described above, we can use the iRODS versions<br />
of the basic Unix commands. The project directory is under /snicZone/proj;<br />
all members of the project should have write access to this directory. We can use<br />
the command<br />
<pre><br />
icd /snicZone/proj/projectname<br />
</pre><br />
to move to the project directory, or to change to another project directory<br />
if we are members of more than one project.<br />
<br />
All commands give short help when invoked with the -h flag.<br />
<br />
To put files into the iRODS system we can use:<br />
<pre><br />
iput localfile irodsfile<br />
</pre><br />
or, to put a whole directory tree:<br />
<pre><br />
iput -r localdirectory irodscollection<br />
</pre><br />
<br />
To upload large amounts of data it might be more advantageous to use<br />
<pre><br />
irsync -r localdirectory irodscollection<br />
</pre><br />
It is a good idea to add the -K option so that checksums are computed,<br />
stored and verified.<br />
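For reference, iRODS 3.x checksums are typically MD5, so the server-side value reported by ichksum can be compared against a locally computed one. A minimal local sketch follows; the ichksum call itself needs a live iRODS session, so it is shown only as a comment.<br />

```shell
# iRODS stores a checksum when files are uploaded with -K (typically MD5
# in iRODS 3.x). A local value for comparison can be computed with md5sum.
printf 'example data\n' > localfile    # stand-in for a real data file
md5sum localfile | awk '{print $1}'    # prints the 32-hex-digit MD5 sum
# Against a live system, compare with the server-side value:
#   ichksum irodsfile
```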
<br />
To create directories (collections in iRODS terminology) we use:<br />
<pre><br />
imkdir collection<br />
</pre><br />
as would be expected.<br />
<br />
To get those files back we can use<br />
<pre><br />
iget irodsfile localfile<br />
</pre><br />
or<br />
<pre><br />
irsync -r irodscollection localdirectory<br />
</pre><br />
<br />
To remove files we use:<br />
<pre><br />
irm<br />
</pre><br />
or<br />
<pre><br />
irm -r<br />
</pre><br />
<br />
Removing files like this moves them into the trashcan (path: /snicZone/trash/).<br />
From time to time we need to empty the trashcan, using<br />
<pre><br />
irmtrash<br />
</pre><br />
<br />
==== Using iCommands on SNIC HPC clusters ====<br />
<br />
On SNIC clusters the iCommands command line tools are either available in the PATH directly or after loading the irods module, e.g.<br />
: module load irods<br />
:If the irods commands are not available on the SNIC HPC cluster, please contact support@swestore.se<br />
You also need to set up the iCommands environment file $HOME/.irods/.irodsEnv<br />
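The availability check above can be wrapped in a small defensive shell function. This sketch assumes only the module command and the iinit client name used on this page, and is harmless to run on systems that have neither.<br />

```shell
# Check whether the iCommands are on the PATH, trying the irods module
# as a fallback (sketch; 'module' may not exist outside the clusters,
# so failures loading it are ignored).
check_icommands() {
    if ! command -v iinit >/dev/null 2>&1; then
        module load irods 2>/dev/null || true
    fi
    if command -v iinit >/dev/null 2>&1; then
        echo "iCommands found: $(command -v iinit)"
    else
        echo "iCommands not found; contact support@swestore.se"
    fi
}
check_icommands
```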
<br />
=== Storage Project directory structure ===<br />
<br />
Your storage project is available at /snicZone/proj/<PROJECT NAME><br />
<br />
/snicZone/home/<USERNAME> is just a small home directory.<br />
<br />
=== iDROP web client ===<br />
<br />
See the dedicated [[iDrop web]] page.<br />
<br />
=== Upstream documentation ===<br />
Detailed documentation, papers and resources are available from<br />
the [http://www.eirods.org E-iRODS web site].<br />
<br />
[http://www.irods.org Community iRODS]<br />
<br />
[https://groups.google.com/d/forum/irod-chat User forum]</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-dCache&diff=5671Swestore-dCache2013-12-18T13:07:59Z<p>Tom Langborg (NSC): </p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], Bioimage Sweden, [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX], [http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage "SweStore"=<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems.<br />
<br />
Swestore is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc], [http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax]. Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple crash of a storage element to losing an entire site while still providing access to the stored data. <br />
<br />
One of the major advantages of the distributed nature of dCache and iRODS ([[Swestore-irods]]) is the excellent aggregate transfer rates possible. This is achieved by bypassing a central node and having transfers go directly to/from the storage elements when the protocol allows it. The Swestore Nationally Accessible Storage system can achieve aggregate transfer rates in excess of 100 gigabit per second, but in practice throughput is limited by each university's connectivity (usually 10 Gbit/s) or, when transferring only a few files, by the per-connection limit (typically<br />
max 1 Gbit/s per file/connection).<br />
<br />
To protect against silent data corruption the dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The dCache system does NOT yet protect against user errors such as inadvertent file deletion. The [[Swestore-irods]] system does provide this protection: deleted files are moved to a trashcan.<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
<br />
;Difference between dCache and iRODS user authentication<br />
:SweStore's dCache system uses eScience client certificates.<br />
:SweStore's iRODS system uses [http://www.yubico.com/products/yubikey-hardware/yubikey/ Yubikey] one-time passwords (OTP). With a simple touch of a button, a 44-character one-time password is generated and sent to the system. The user will be provided with a SweStore yubikey.<br />
:Yubikey authentication currently has pilot status and may change in the future.<br />
<br />
; dCache usage - How to acquire an eScience client certificate<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
:; Request membership in the SweGrid VO<br />
:: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
; iRODS usage - How to acquire a SweStore yubikey<br />
:Please send an email to [mailto:support@swestore.se?subject=Yubikey support@swestore.se] and provide the shipping address to where the yubikey should be sent.<br><br />
:Yubikey authentication currently has pilot status and may change in the future.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
== dCache ==<br />
<br />
=== Access protocols ===<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
<br />
For authentication, eScience certificates are used, which provide a higher level of security than legacy username/password schemes.<br />
<br />
=== Download and upload data ===<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways: as a simple and reliable directory index interface at https://webdav.swestore.se/ and as a richer interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager has many features and functions that are not supported on SweStore; only the basic file transfer features are supported.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
There are several tools that are capable of using the protocols provided by SweStore national storage.<br />
For interactive usage on SNIC clusters we recommend using the ARC tools which should be installed on all SNIC resources.<br />
As an integration point for building scripts and automated systems we suggest using the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
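As an illustration of the curl route suggested above, the following sketch assembles a WebDAV download command using a client certificate. The certificate paths (the conventional ~/.globus location) and the project name are hypothetical examples, and the command is only echoed, since running it requires network access and a valid eScience certificate.<br />

```shell
# Sketch of a curl download from the SweStore WebDAV endpoint using a
# client certificate. Paths and project name below are hypothetical.
CERT="$HOME/.globus/usercert.pem"   # conventional grid-certificate location
KEY="$HOME/.globus/userkey.pem"
PROJECT="snic-example-1"            # hypothetical project name
URL="https://webdav.swestore.se/snic/$PROJECT/results.tar.gz"

# Print the command instead of executing it (standard curl
# client-certificate options: --cert / --key).
echo curl --fail --cert "$CERT" --key "$KEY" -O "$URL"
```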
<br />
=== Tools and scripts ===<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
=== Slides and more ===<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
=== Usage monitoring ===<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== iRODS ==<br />
<br />
Documentation of the SNIC iRODS system: [[Swestore-irods]].</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-dCache&diff=5670Swestore-dCache2013-12-18T13:06:46Z<p>Tom Langborg (NSC): /* National storage */</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], Bioimage Sweden, [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX], [http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage "SweStore"=<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems.<br />
<br />
Swestore is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc], [http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax]. Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple crash of a storage element to losing an entire site while still providing access to the stored data. <br />
<br />
One of the major advantages of the distributed nature of dCache and iRODS ([[Swestore-irods]]) is the excellent aggregate transfer rates possible. This is achieved by bypassing a central node and having transfers go directly to/from the storage elements when the protocol allows it. The Swestore Nationally Accessible Storage system can achieve aggregate transfer rates in excess of 100 gigabit per second, but in practice throughput is limited by each university's connectivity (usually 10 Gbit/s) or, when transferring only a few files, by the per-connection limit (typically<br />
max 1 Gbit/s per file/connection).<br />
<br />
To protect against silent data corruption the dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The dCache system does NOT yet protect against user errors such as inadvertent file deletion. The [[Swestore-irods]] system does provide this protection: deleted files are moved to a trashcan.<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
<br />
;Difference between dCache and iRODS user authentication<br />
:SweStore's dCache system uses eScience client certificates.<br />
:SweStore's iRODS system uses [http://www.yubico.com/products/yubikey-hardware/yubikey/ Yubikey] one-time passwords (OTP). With a simple touch of a button, a 44-character one-time password is generated and sent to the system. The user will be provided with a SweStore yubikey.<br />
:Yubikey authentication currently has pilot status and may change in the future.<br />
<br />
; dCache usage - How to acquire an eScience client certificate<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
:; Request membership in the SweGrid VO<br />
:: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
; iRODS usage - How to acquire a SweStore yubikey<br />
:Please send an email to [mailto:support@swestore.se?subject=Yubikey support@swestore.se] and provide the shipping address to where the yubikey should be sent.<br><br />
:Yubikey authentication currently has pilot status and may change in the future.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
== dCache ==<br />
<br />
=== Access protocols ===<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
<br />
For authentication, eScience certificates are used, which provide a higher level of security than legacy username/password schemes.<br />
<br />
=== Download and upload data ===<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways: as a simple and reliable directory index interface at https://webdav.swestore.se/ and as a richer interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager has many features and functions that are not supported on SweStore; only the basic file transfer features are supported.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
There are several tools that are capable of using the protocols provided by SweStore national storage.<br />
For interactive usage on SNIC clusters we recommend using the ARC tools which should be installed on all SNIC resources.<br />
As an integration point for building scripts and automated systems we suggest using the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
<br />
=== Tools and scripts ===<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
=== Slides and more ===<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
=== Usage monitoring ===<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== iRODS ==<br />
<br />
Documentation of the SNIC iRODS system: [[Swestore-irods]].<br />
<br />
= Center storage =<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem the same way on all computational resources at a centre, and a unified structure and nomenclature for all centres. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their data when clusters are decommissioned, not even when the storage hardware itself is replaced.<br />
<br />
== Unified environment ==<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=SNIC_storage&diff=5659SNIC storage2013-12-16T09:17:07Z<p>Tom Langborg (NSC): /* SNIC storage The Swedish Storage Initiative */</p>
<hr />
<div>=SNIC storage "The Swedish Storage Initiative"=<br />
<br />
'''Assignment:''' create a collaborative infrastructure for the storage of Swedish research and university data with demonstrated/documented needs from the relevant communities.<br />
<br />
In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modelling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
<br />
==National Storage "SweStore"==<br />
<br />
The aim of the nationally accessible storage is to build a robust, flexible and expandable system that can be used in most cases where access to large scale storage is needed. To the user it should appear as a single large system, while it is desirable that some parts of the system are distributed across all SNIC centres to benefit from the advantages of, among other things, locality and cache effects. The system is intended as a versatile long-term storage system.<br />
<br />
Documentation of the SweStore: [[Swestore]].<br />
<br />
== Center storage ==<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem the same way on all computational resources at a centre, and a unified structure and nomenclature for all centres. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their data when clusters are decommissioned, not even when the storage hardware itself is replaced.<br />
<br />
=== Unified environment ===<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=SNIC_storage&diff=5658SNIC storage2013-12-16T08:15:55Z<p>Tom Langborg (NSC): /* National Storage "SweStore" */</p>
<hr />
<div>=SNIC storage "The Swedish Storage Initiative"=<br />
<br />
In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modelling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
<br />
==National Storage "SweStore"==<br />
<br />
The aim of the nationally accessible storage is to build a robust, flexible and expandable system that can be used in most cases where access to large scale storage is needed. To the user it should appear as a single large system, while it is desirable that some parts of the system are distributed across all SNIC centres to benefit from the advantages of, among other things, locality and cache effects. The system is intended as a versatile long-term storage system.<br />
<br />
Documentation of the SweStore: [[Swestore]].<br />
<br />
== Center storage ==<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem the same way on all computational resources at a centre, and a unified structure and nomenclature for all centres. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their data when clusters are decommissioned, not even when the storage hardware itself is replaced.<br />
<br />
=== Unified environment ===<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=SNIC_storage&diff=5657SNIC storage2013-12-16T08:01:55Z<p>Tom Langborg (NSC): </p>
<hr />
<div>=SNIC storage "The Swedish Storage Initiative"=<br />
<br />
In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modelling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
<br />
==National Storage "SweStore"==<br />
<br />
The aim of the nationally accessible storage is to build a robust, flexible and expandable system that can be used in most cases where access to large scale storage is needed. To the user it should appear as a single large system, while it is desirable that some parts of the system are distributed across all SNIC centres to benefit from the advantages of, among other things, locality and cache effects. The system is intended as a versatile long-term storage system.<br />
<br />
== Center storage ==<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem the same way on all computational resources at a centre, and a unified structure and nomenclature for all centres. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their data when clusters are decommissioned, not even when the storage hardware itself is replaced.<br />
<br />
=== Unified environment ===<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-irods&diff=5599Swestore-irods2013-11-12T12:17:27Z<p>Tom Langborg (NSC): /* Getting access */</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
<br />
'''This is not official yet'''<br />
<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], Bioimage Sweden, [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX], [http://wlcg.web.cern.ch/ WLCG] and [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage =<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems.<br />
<br />
Swestore is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc], [http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax]. Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple crash of a storage element to losing an entire site while still providing access to the stored data. <br />
<br />
One of the major advantages of the distributed nature of dCache and iRODS is the excellent aggregated transfer rates that are possible. This is achieved by bypassing a central node and having transfers go directly to/from the storage elements when the protocol allows it. The Swestore Nationally Accessible Storage system can achieve aggregated transfer rates in excess of 100 gigabit per second, but in practice transfers are limited by the connectivity of each university (usually 10 Gbit/s) or, when only a few files are transferred, by the per-connection rate (typically at most 1 Gbit/s per file/connection).<br />
<br />
To protect against silent data corruption the dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The dCache system does NOT yet provide protection against user errors such as inadvertent file deletions. The iRODS system does provide this protection: deleted files are moved to a trash can.<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
<br />
;Difference between dCache and iRODS user authentication<br />
:SweStore's dCache system uses eScience client certificates.<br />
:SweStore's iRODS system uses [http://www.yubico.com/products/yubikey-hardware/yubikey/ Yubikey] one-time passwords (OTP). With a simple touch of a button, a 44 character one-time password is generated and sent to the system.<br />
:The Yubikey solution currently has pilot status and may change in the future.<br />
<br />
; dCache usage - How to acquire an eScience client certificate<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
:; Request membership in the SweGrid VO<br />
:: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
; iRODS usage - How to acquire a SweStore YubiKey<br />
<br />
The Yubikey solution is still in a pilot phase and may change in the future.<br><br />
To apply for a SweStore yubikey, please send an email to [mailto:support@swestore.se?subject=Yubikey support@swestore.se] and provide a shipping address to where the yubikey should be sent.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
== dCache ==<br />
<br />
=== Access protocols ===<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
<br />
For authentication, eScience certificates are used, which provide a higher level of security than legacy username/password schemes.<br />
<br />
=== Download and upload data ===<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways: as a directory index at https://webdav.swestore.se/ and as an interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager has many features and functions that are not supported in SweStore; only the basic file transfer features are supported.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
: There are several tools capable of using the protocols provided by SweStore national storage. For interactive usage on SNIC clusters we recommend the ARC tools, which should be installed on all SNIC resources. As an integration point for building scripts and automated systems we suggest the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
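As an illustrative sketch (not an official recipe, see the linked instruction pages above for the supported methods), an upload over the WebDAV endpoint with curl could look like the following; the certificate paths, project name and file name are placeholders:<br />

```shell
# Hypothetical sketch: upload a file to a SweStore project directory over
# WebDAV with curl, authenticating with an eScience client certificate.
# CERT, KEY, PROJECT and FILE are placeholders, not official paths or names.
CERT="$HOME/.globus/usercert.pem"
KEY="$HOME/.globus/userkey.pem"
PROJECT="YOUR_PROJECT_NAME"
FILE="results.tar.gz"

# Projects are organized under /snic on the WebDAV endpoint (see above).
URL="https://webdav.swestore.se/snic/$PROJECT/$FILE"
echo "Would upload $FILE to $URL"
# Uncomment to perform the actual transfer:
# curl --fail --cert "$CERT" --key "$KEY" --upload-file "$FILE" "$URL"
```

In scripts, curl's --fail option makes the command exit non-zero on HTTP errors, which is useful for automation.<br />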
<br />
=== Tools and scripts ===<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
=== Slides and more ===<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
=== Usage monitoring ===<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== iRODS ==<br />
<br />
There is a SNIC iRODS system available under Swestore.<br />
<br />
=== Supported clients ===<br />
<br />
: iDrop web - Point your Web browser to [https://iweb.swestore.se iweb.swestore.se]<br />
: E-iRODS iCommands - Command line client [http://eirods.org/download/ Download E-iRODS icommands]<br />
<br />
SweStore iRODS uses PAM authentication and SweStore yubikeys. With a simple touch of a button, a 44 character one-time password is generated and sent to the system.<br />
<br />
The community iRODS client should also work with PAM authentication, e.g. after making the following changes to the Makefile iRODS/config/config.mk and recompiling:<br />
<pre><br />
PAM_AUTH = 1<br />
PAM_AUTH_NO_EXTEND = 1<br />
USE_SSL = 1 <br />
</pre><br />
<br />
=== SweStore iRODS usage documentation ===<br />
<br />
To use the system you need to have the E-iRODS command line client installed, or you can use the iDROP web interface.<br />
<br />
==== Command line client ====<br />
<br />
For Linux systems the iRODS command-line client is available as an installable package for various Linux platforms from the downloads section of the E-iRODS website.<br />
<br />
The command line client feels natural to Unix users: there are versions of the usual ls, rm, mv, mkdir, pwd and rsync commands, prefixed with an i for iRODS, e.g. irm, imv, imkdir etc.<br />
<br />
As expected, iput and iget move files to and from the iRODS system.<br />
All these commands print short help when using the -h option.<br />
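As a sketch of a typical session (the file and collection names are made-up placeholders; this assumes the icommands are installed and iinit has already been run):<br />

```shell
# Illustrative iCommands session; "mydata.tar.gz" and "results" are
# placeholders. The commands only run if the icommands are installed.
status="skipped"
if command -v iput >/dev/null 2>&1; then
    iput mydata.tar.gz                      # upload into the current collection
    ils                                     # list the current collection
    imkdir results                          # create a sub-collection
    imv mydata.tar.gz results/mydata.tar.gz # move the uploaded file there
    iget results/mydata.tar.gz /tmp         # fetch a local copy
    status="ran"
fi
echo "icommands session: $status"
```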
<br />
===== iCommands environment file =====<br />
<br />
There is an environment file .irodsEnv in the .irods subdirectory<br />
of the home directory ($HOME/.irods/.irodsEnv) which contains information where and how<br />
to access the iRODS metadata (iCAT) server.<br />
<br />
It looks like (placeholders are in <>):<br />
<pre><br />
irodsHost 'irods.swestore.se'<br />
irodsPort 1247<br />
irodsDefResource 'snicdefResc'<br />
irodsHome '/snicZone/home/<email address>'<br />
irodsCwd '/snicZone/home/<email address>'<br />
irodsUserName '<email address>'<br />
irodsZone 'snicZone'<br />
irodsAuthScheme 'PAM'<br />
</pre><br />
<br />
The iCAT server is irods.swestore.se.<br />
The default irods zone name is snicZone.<br />
The default resource is snicdefResc.<br />
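For example, the environment file could be created as follows; the email address is a placeholder for your own SweStore iRODS user name:<br />

```shell
# Sketch: create the iCommands environment file described above.
# "user@example.se" is a placeholder for your own SweStore user name.
mkdir -p "$HOME/.irods"
cat > "$HOME/.irods/.irodsEnv" <<'EOF'
irodsHost 'irods.swestore.se'
irodsPort 1247
irodsDefResource 'snicdefResc'
irodsHome '/snicZone/home/user@example.se'
irodsCwd '/snicZone/home/user@example.se'
irodsUserName 'user@example.se'
irodsZone 'snicZone'
irodsAuthScheme 'PAM'
EOF
```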
<br />
With the correct environment file, all we need is a Yubikey: run the iinit command to authenticate to the iCAT server. After that the usual iCommands can be used for 8 hours.<br />
<br />
More details on the iCommands are available at<br />
https://www.irods.org/index.php/icommands<br />
<br />
===== Using iCommands on SNIC HPC clusters =====<br />
<br />
On SNIC clusters the icommands command line tools are either available in the PATH or after loading the irods module, e.g.<br />
: module load irods<br />
We also need to set up the iCommands environment file $HOME/.irods/.irodsEnv.<br />
<br />
==== iDROP web client ====<br />
<br />
The web client is accessible at https://iweb.swestore.se/. A login screen is presented first; use your Yubikey to log in.<br />
<br />
==== Upstream documentation ====<br />
Detailed documentation, papers and resources are available from<br />
the [http://www.eirods.org E-iRODS web site]<br />
<br />
[http://www.irods.org Community iRODS]<br />
<br />
[https://groups.google.com/d/forum/irod-chat User forum]<br />
<br />
= Centre storage =<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem in the same way on all computational resources at a centre, and a unified structure and nomenclature for all centres. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their data when clusters are decommissioned, not even when the storage hardware itself is replaced.<br />
<br />
== Unified environment ==<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-irods&diff=5598Swestore-irods2013-11-12T12:12:20Z<p>Tom Langborg (NSC): /* Getting access */</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
<br />
'''This is not official yet'''<br />
<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], Bioimage Sweden, [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX],[http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ NaturHistoriska RiksMuseet].<br />
<br />
= National storage =<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems.<br />
<br />
Swestore is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc], [http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax]. Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple crash of a storage element to losing an entire site while still providing access to the stored data. <br />
<br />
One of the major advantages to the distributed nature of dCache and iRODS is the excellent aggregated transfer rates possible. This is achieved by bypassing a central node and having transfers going directly to/from the storage elements if the protocol allows it. The Swestore Nationally Accessible Storage system can achieve aggregated transfer rates in excess of 100 Gigabit per second, but in practice this is limited by connectivity to each University (usually 10 Gbit/s) or a limited number of files (typically<br />
max 1 Gbit/s per file/connection).<br />
<br />
To protect against silent data corruption the dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The dCache system does NOT yet provide protection against user errors like inadvertent file deletions and so on. The iRODS system provides this protection. Deleted files are moved to a trashcan.<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
<br />
;Difference between dCache and iRODS user authentication<br />
:SweStore's dCache system uses eScience client certificates.<br />
:SweStore's iRODS system uses [http://www.yubico.com/products/yubikey-hardware/yubikey/ Yubikey] one-time passwords (OTP). With a simple touch of a button, a 44 character one-time password is generated and sent to the system.<br />
:Yubikey has a status as pilot now. It can be changed to some thing other in the future.<br />
<br />
; dCache usage - How to acquire an eScience client certificate<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
:; Request membership in the SweGrid VO<br />
:: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
; iRODS usage - How to acquire a SweStore YubiKey<br />
<br />
To apply for a SweStore yubikey, please send an email to [mailto:support@swestore.se?subject=Yubikey support@swestore.se] and provide a shipping address to where the yubikey should be sent.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
== dCache ==<br />
<br />
=== Access protocols ===<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
<br />
For authentication eScience certificates are used, which provides a higher level of security than legacy username/password schemes.<br />
<br />
=== Download and upload data ===<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways, as a directory index interface at https://webdav.swestore.se/ and with an interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager has a lot of features and functions not supported in SweStore, only the basic file transfer features are supported.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
There are several tools that are capable of using the protocols provided by SweStore national storage.<br />
For interactive usage on SNIC clusters we recommend using the ARC tools which should be installed on all SNIC resources.<br />
As an integration point for building scripts and automated systems we suggest using the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
<br />
=== Tools and scripts ===<br />
<br />
There exists a number of tools and utilities developed externally that can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
=== Slides and more ===<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
=== Usage monitoring ===<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== iRODS ==<br />
<br />
There is a SNIC iRODS system available under Swestore.<br />
<br />
=== Supported clients ===<br />
<br />
: iDrop web - Point your Web browser to [https://iweb.swestore.se iweb.swestore.se]<br />
: E-iRODS iCommands - Command line client [http://eirods.org/download/ Download E-iRODS icommands]<br />
<br />
SweStore iRODS uses PAM authentication and SweStore yubikeys. With a simple touch of a button, a 44 character one-time password is generated and sent to the system.<br />
<br />
The community iRODS client also should work, with PAM authentication e.g.<br />
the following changes to the Makefile iRODS/config/config.mk and a recompile:<br />
<pre><br />
PAM_AUTH = 1<br />
PAM_AUTH_NO_EXTEND = 1<br />
USE_SSL = 1 <br />
</pre><br />
<br />
=== SweStore iRODS usage documentation ===<br />
<br />
To use the system you need to have the E-iRODS command line client installed or using iDROP web. <br />
<br />
==== Command line client ====<br />
<br />
For Linux systems the iRODS commandline client is available as an installable package for various<br />
Linux platforms from the e-iRODS website downloads section.<br />
<br />
The command line client is natural to use for Unix users.<br />
There are versions of the usual ls, rm, mv, mkdir, pwd, rsync<br />
commands prefixed with an i for iRODS, i.e. irm, imv, imkdir etc.<br />
<br />
As expected iput and iget move files to and from the irods system.<br />
All these commands print short help when using the -h option.<br />
<br />
===== iCommands environment file =====<br />
<br />
There is an environment file .irodsEnv in the .irods subdirectory<br />
of the home directory ($HOME/.irods/.irodsEnv) which contains information where and how<br />
to access the iRODS metadata (iCAT) server.<br />
<br />
It looks like (placeholders are in <>):<br />
<pre><br />
irodsHost 'irods.swestore.se'<br />
irodsPort 1247<br />
irodsDefResource 'snicdefResc'<br />
irodsHome '/snicZone/home/<email address>'<br />
irodsCwd '/snicZone/home/<email address>'<br />
irodsUserName '<email address>'<br />
irodsZone 'snicZone'<br />
irodsAuthScheme 'PAM'<br />
</pre><br />
<br />
The iCAT server is irods.swestore.se.<br />
The default irods zone name is snicZone.<br />
The default resource is snicdefResc.<br />
<br />
With the corrent environment file all we need is a Yubikey and we can run the iinit command to authenticate to the iCAT server. After that we can use the usual iCommands for 8 hours.<br />
<br />
More details on the iCommands are available at<br />
https://www.irods.org/index.php/icommands<br />
<br />
===== Using iCommands on SNIC HPC clusters =====<br />
<br />
On SNIC-clusters the icommands command line tools are either available in the PATH or by adding the irods module, e.g.<br />
: module load irods<br />
We also need to setup the iCommands environment file $HOME/.irods/.irodsEnv<br />
<br />
==== iDROP web client ====<br />
<br />
The web client is accessible via the URL https://iweb.swestore.se/.<br />
A login screen will be presented first and your Yubikey should<br />
be used to log in.<br />
<br />
==== Upstream documentation ====<br />
Detailed documentation, papers and resources are available from<br />
the [http://www.eirods.org E-iRODS web site]<br />
<br />
[http://www.irods.org Community iRODS]<br />
<br />
[https://groups.google.com/d/forum/irod-chat User forum]<br />
<br />
= Center storage =<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem the same way on all computational resources at a centre, and a unified structure and nomenclature for all centra. Unlike cluster storage which is tightly associated with a single cluster, and thus has a limited life-time, centre storage does not require the users to migrate their own data when clusters are decommissioned, not even when the storage hardware itself is being replaced.<br />
<br />
== Unified environment ==<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-irods&diff=5597Swestore-irods2013-11-12T12:11:39Z<p>Tom Langborg (NSC): /* Getting access */</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
<br />
'''This is not official yet'''<br />
<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], Bioimage Sweden, [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX], [http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage =<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems.<br />
<br />
Swestore is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc], [http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax]. Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple crash of a storage element to losing an entire site while still providing access to the stored data. <br />
<br />
One of the major advantages of the distributed nature of dCache and iRODS is the excellent aggregated transfer rates that are possible. This is achieved by bypassing a central node and having transfers go directly to/from the storage elements if the protocol allows it. The Swestore Nationally Accessible Storage system can achieve aggregated transfer rates in excess of 100 Gigabit per second, but in practice this is limited by the connectivity of each university (usually 10 Gbit/s) or by transferring only a small number of files (typically max 1 Gbit/s per file/connection).<br />
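To get a feel for what these rates mean in practice, here is a quick back-of-the-envelope calculation (decimal units assumed: 1 TB = 10^12 bytes, 1 Gbit = 10^9 bits; real transfers add protocol overhead):<br />

```python
def transfer_time_hours(size_tb, rate_gbit_s):
    """Ideal time to move `size_tb` terabytes at `rate_gbit_s` gigabit/s."""
    bits = size_tb * 1e12 * 8          # decimal TB -> bits
    return bits / (rate_gbit_s * 1e9) / 3600

# Moving 1 TB over a single 1 Gbit/s file stream vs. a full 10 Gbit/s site link:
print(round(transfer_time_hours(1, 1), 1))   # single stream: ~2.2 hours
print(round(transfer_time_hours(1, 10), 2))  # site uplink:   ~0.22 hours
```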
<br />
To protect against silent data corruption the dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
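The same idea can be applied client-side: record a checksum before uploading and compare it after downloading. The sketch below uses Adler-32 (believed to be the default dCache checksum type) purely as an illustration; it is not an official SweStore tool.<br />

```python
import zlib

def adler32_file(path, chunk_size=1 << 20):
    """Compute the Adler-32 checksum of a file, streaming in 1 MiB chunks."""
    checksum = 1  # Adler-32 seed value
    with open(path, "rb") as fh:
        while True:
            chunk = fh.read(chunk_size)
            if not chunk:
                break
            checksum = zlib.adler32(chunk, checksum)
    return checksum & 0xFFFFFFFF

def verify(path, expected):
    """Compare a local file against a previously recorded checksum."""
    return adler32_file(path) == expected
```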
<br />
The dCache system does NOT yet provide protection against user errors such as inadvertent file deletions. The iRODS system does provide this protection: deleted files are moved to a trash can.<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
<br />
;Difference between dCache and iRODS user authentication<br />
:SweStore's dCache system uses eScience client certificates.<br />
:SweStore's iRODS system uses [http://www.yubico.com/products/yubikey-hardware/yubikey/ Yubikey] one-time passwords (OTP). With a simple touch of a button, a 44 character one-time password is generated and sent to the system. <br><br />
The Yubikey solution currently has pilot status and may be replaced by something else in the future.<br />
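For reference, the 44 characters of a Yubikey OTP are drawn from Yubico's 16-character "modhex" alphabet, and the first 12 characters identify the physical key. A cheap client-side format check — an illustration only, since real validation always happens server-side — could look like:<br />

```python
MODHEX = set("cbdefghijklnrtuv")  # Yubico's keyboard-layout-independent alphabet

def looks_like_yubikey_otp(otp):
    """Cheap format check: 44 modhex characters. Real validation is server-side."""
    return len(otp) == 44 and all(ch in MODHEX for ch in otp)

def public_id(otp):
    """The first 12 characters identify the physical key; None if malformed."""
    return otp[:12] if looks_like_yubikey_otp(otp) else None
```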
<br />
; dCache usage - How to acquire an eScience client certificate<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
:; Request membership in the SweGrid VO<br />
:: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
; iRODS usage - How to acquire a SweStore YubiKey<br />
<br />
To apply for a SweStore yubikey, please send an email to [mailto:support@swestore.se?subject=Yubikey support@swestore.se] and provide a shipping address to where the yubikey should be sent.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
== dCache ==<br />
<br />
=== Access protocols ===<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
<br />
For authentication, eScience certificates are used; these provide a higher level of security than legacy username/password schemes.<br />
<br />
=== Download and upload data ===<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways: as a directory index interface at https://webdav.swestore.se/ and as an interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager has many features and functions that are not supported in SweStore; only the basic file transfer features are supported.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
There are several tools that are capable of using the protocols provided by SweStore national storage.<br />
For interactive usage on SNIC clusters we recommend using the ARC tools which should be installed on all SNIC resources.<br />
As an integration point for building scripts and automated systems we suggest using the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
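To illustrate the automation angle, the sketch below assembles (but does not run) a curl invocation for a WebDAV upload; the command is only printed, since actually running it requires a valid client certificate. The certificate path and the exact flags are assumptions for illustration — see [[Accessing SweStore national storage with cURL]] for the recommended options.<br />

```python
import shlex

def build_upload_command(local_file, project, remote_name,
                         cert="~/.globus/usercred.p12"):  # hypothetical cert path
    """Assemble (but do not run) an illustrative curl upload command."""
    url = "https://webdav.swestore.se/snic/{}/{}".format(project, remote_name)
    argv = [
        "curl",
        "--fail",                    # surface HTTP errors as a non-zero exit status
        "--cert", cert,              # client certificate (path is an assumption)
        "--upload-file", local_file,
        url,
    ]
    return " ".join(shlex.quote(a) for a in argv)

print(build_upload_command("results.tar.gz", "YOUR_PROJECT_NAME", "results.tar.gz"))
```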
<br />
=== Tools and scripts ===<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
=== Slides and more ===<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
=== Usage monitoring ===<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== iRODS ==<br />
<br />
There is a SNIC iRODS system available under Swestore.<br />
<br />
=== Supported clients ===<br />
<br />
: iDrop web - Point your Web browser to [https://iweb.swestore.se iweb.swestore.se]<br />
: E-iRODS iCommands - Command line client [http://eirods.org/download/ Download E-iRODS icommands]<br />
<br />
SweStore iRODS uses PAM authentication and SweStore yubikeys. With a simple touch of a button, a 44 character one-time password is generated and sent to the system.<br />
<br />
The community iRODS client should also work with PAM authentication, given e.g.<br />
the following changes to the Makefile iRODS/config/config.mk and a recompile:<br />
<pre><br />
PAM_AUTH = 1<br />
PAM_AUTH_NO_EXTEND = 1<br />
USE_SSL = 1 <br />
</pre><br />
<br />
=== SweStore iRODS usage documentation ===<br />
<br />
To use the system you need to have the E-iRODS command line client installed, or use iDROP web.<br />
<br />
==== Command line client ====<br />
<br />
For Linux systems the iRODS commandline client is available as an installable package for various<br />
Linux platforms from the e-iRODS website downloads section.<br />
<br />
The command line client is natural to use for Unix users.<br />
There are versions of the usual ls, rm, mv, mkdir, pwd, rsync<br />
commands prefixed with an i for iRODS, i.e. irm, imv, imkdir etc.<br />
<br />
As expected iput and iget move files to and from the irods system.<br />
All these commands print short help when using the -h option.<br />
<br />
===== iCommands environment file =====<br />
<br />
There is an environment file .irodsEnv in the .irods subdirectory<br />
of the home directory ($HOME/.irods/.irodsEnv) which contains information where and how<br />
to access the iRODS metadata (iCAT) server.<br />
<br />
It looks like (placeholders are in <>):<br />
<pre><br />
irodsHost 'irods.swestore.se'<br />
irodsPort 1247<br />
irodsDefResource 'snicdefResc'<br />
irodsHome '/snicZone/home/<email address>'<br />
irodsCwd '/snicZone/home/<email address>'<br />
irodsUserName '<email address>'<br />
irodsZone 'snicZone'<br />
irodsAuthScheme 'PAM'<br />
</pre><br />
<br />
The iCAT server is irods.swestore.se.<br />
The default irods zone name is snicZone.<br />
The default resource is snicdefResc.<br />
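Since this file is easy to get subtly wrong when written by hand, here is a small unofficial helper that generates it from your e-mail address, using the defaults listed above (the target path is the documented $HOME/.irods/.irodsEnv):<br />

```python
import os

# Template mirrors the example .irodsEnv above; only the e-mail varies.
TEMPLATE = """\
irodsHost 'irods.swestore.se'
irodsPort 1247
irodsDefResource 'snicdefResc'
irodsHome '/snicZone/home/{email}'
irodsCwd '/snicZone/home/{email}'
irodsUserName '{email}'
irodsZone 'snicZone'
irodsAuthScheme 'PAM'
"""

def write_irods_env(email, path=None):
    """Create the iCommands environment file with SweStore defaults."""
    if path is None:
        path = os.path.expanduser("~/.irods/.irodsEnv")
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as fh:
        fh.write(TEMPLATE.format(email=email))
    return path
```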
<br />
With the correct environment file, all you need is a Yubikey: run the iinit command to authenticate to the iCAT server. After that you can use the usual iCommands for 8 hours.<br />
<br />
More details on the iCommands are available at<br />
https://www.irods.org/index.php/icommands<br />
<br />
===== Using iCommands on SNIC HPC clusters =====<br />
<br />
On SNIC clusters the iCommands command line tools are either available in the PATH or can be made available by loading the irods module, e.g.<br />
: module load irods<br />
You also need to set up the iCommands environment file $HOME/.irods/.irodsEnv<br />
<br />
==== iDROP web client ====<br />
<br />
The web client is accessible via the URL https://iweb.swestore.se/.<br />
A login screen will be presented first and your Yubikey should<br />
be used to log in.<br />
<br />
==== Upstream documentation ====<br />
Detailed documentation, papers and resources are available from<br />
the [http://www.eirods.org E-iRODS web site]<br />
<br />
[http://www.irods.org Community iRODS]<br />
<br />
[https://groups.google.com/d/forum/irod-chat User forum]<br />
<br />
= Center storage =<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem in the same way on all computational resources at a centre, and a unified structure and nomenclature across all centres. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their own data when clusters are decommissioned, not even when the storage hardware itself is being replaced.<br />
<br />
== Unified environment ==<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=SweStore/iRODS_icommand&diff=5525SweStore/iRODS icommand2013-10-31T11:59:34Z<p>Tom Langborg (NSC): </p>
<hr />
<div>For accessing the Swestore national storage, we use (since Nov 2013) iRODS, and more specifically the iRODS client called "icommands": a set of commands similar to the Unix commands "ls", "cd", "mkdir" etc., but with an "i" in front (ils, icd, imkdir), plus two FTP-like commands: "iput" and "iget" (plus some more, but those are the ones you need most).<br />
<br />
Note that data put on Swestore via iRODS is not accessible through other means.<br />
<br />
== Activate the iRODS icommands ==<br />
The iRODS icommands client is activated through the module system, as so many other things on SNIC clusters.<br />
<br />
1. Log in to SNIC cluster<br><br />
2. Execute:<br />
<br />
module load irods <br><br />
iinit <br><br />
After activating the iRODS icommands, you will be placed in one of your projects. If you run "ils", you will then see a listing of files and folders in that project, something like this:<br />
<br />
[samuel@kalkyl4 ~]$ ils<br />
/ssUppnexZone/proj/b2011221:<br />
C- /ssUppnexZone/proj/b2011221/firstRun<br />
== Navigate around ==<br />
To enter one of the (or the only) folder(s), do:<br />
<br />
 icd [folder-name]<br />
... in this case:<br />
<br />
 icd firstRun<br />
To switch to another project, use<br />
<br />
icd ..<br />
to back up a level. After that, you can change to another project or folder using icd as previously<br />
<br />
icd b2011222<br />
== Upload files ==<br />
To upload the above mentioned folder, do (-r is needed to recurse into directories):<br />
<br />
iput -r [a local folder]<br />
For single files, the -r flag is not needed:<br />
<br />
iput [a local file]<br />
To list the newly uploaded file/directory:<br />
<br />
ils<br />
To create folders, do:<br />
<br />
imkdir [folder-name]<br />
If you want to verify the upload outside of iRODS, there's a utility called ssverify.sh that will let you do that.<br />
<br />
== Downloading files ==<br />
To download a file again, do:<br />
<br />
iget [a file in iRODS]<br />
... or, for folders, do it recursively:<br />
<br />
iget -r [a folder in iRODS]<br />
== File removal ==<br />
To remove a file, you'll likely need to pass -f to irm in order to bypass the use of a trash area, which is currently not supported at Uppmax.<br />
<br />
== More hints ==<br />
You can have iput show the progress with the -P flag<br />
If you want to be placed in a particular directory when you load the irods module, you can create a file called<br />
$HOME/.irods/.irodsEnv<br />
containing<br />
irodsHome '/ssUppnexZone/proj/myproj'<br />
irodsCwd '/ssUppnexZone/proj/myproj'<br />
where<br />
myproj<br />
is replaced by your project id.<br />
Read more info about the respective i-commands with "[command] -h"</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-irods&diff=5523Swestore-irods2013-10-31T11:52:17Z<p>Tom Langborg (NSC): </p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
<br />
'''This is not official yet'''<br />
<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], Bioimage Sweden, [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX], [http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage =<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems.<br />
<br />
Swestore is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc], [http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax]. Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple crash of a storage element to losing an entire site while still providing access to the stored data. <br />
<br />
One of the major advantages of the distributed nature of dCache and iRODS is the excellent aggregated transfer rates that are possible. This is achieved by bypassing a central node and having transfers go directly to/from the storage elements if the protocol allows it. The Swestore Nationally Accessible Storage system can achieve aggregated transfer rates in excess of 100 Gigabit per second, but in practice this is limited by the connectivity of each university (usually 10 Gbit/s) or by transferring only a small number of files (typically max 1 Gbit/s per file/connection).<br />
<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
==Difference between dCache and iRODS==<br />
<br />
== dCache ==<br />
To protect against silent data corruption the dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The system does NOT yet provide protection against user errors like inadvertent file deletions and so on.<br />
<br />
=== Access protocols ===<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
<br />
For authentication, eScience certificates are used; these provide a higher level of security than legacy username/password schemes.<br />
<br />
=== Acquire an eScience client certificate ===<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
; Request membership in the SweGrid VO<br />
: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
=== Download and upload data ===<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways: as a directory index interface at https://webdav.swestore.se/ and as an interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager has many features and functions that are not supported in SweStore; only the basic file transfer features are supported.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
There are several tools that are capable of using the protocols provided by SweStore national storage.<br />
For interactive usage on SNIC clusters we recommend using the ARC tools which should be installed on all SNIC resources.<br />
As an integration point for building scripts and automated systems we suggest using the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
<br />
=== Tools and scripts ===<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
=== Slides and more ===<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
=== Usage monitoring ===<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== iRODS ==<br />
<br />
=== Acquire a SweStore YubiKey ===<br />
<br />
For authentication [http://www.yubico.com/products/yubikey-hardware/yubikey/ Yubikey] one-time passwords (OTP) are used. With a simple touch of a button, a 44 character one-time password is generated and sent to the system. <br />
<br />
When you apply for storage, please provide your email address and a physical address where the yubikey should be sent. <br><br />
<br />
=== Supported clients ===<br />
<br />
: iDrop web - Point your Web browser to [https://iweb.swestore.se iweb.swestore.se]<br />
: Command line client [http://eirods.org/download/ eirods icommands] [[SweStore/iRODS_icommand|How to use icommand on SNIC clusters]]<br />
<br />
=== The SweStore iRODS system ===<br />
<br />
The SweStore iRODS system is hosted at NSC and runs on two<br />
physical servers as a collection of virtual machines.<br />
<br />
The iCAT server handles the metadata. It runs<br />
a Postgres database which contains information about where<br />
to find any particular file in the system.<br />
<br />
There are four storage servers which have a small amount of<br />
local disk space; they use the dCache system via NFS4 to<br />
store larger amounts of data.<br />
<br />
=== Using the SweStore iRODS system ===<br />
<br />
Detailed documentation, papers and resources are available from<br />
the e-iRODS web site, http://www.eirods.org.<br />
<br />
Web site for the community iRODS is http://www.irods.org.<br />
<br />
To use the system you need to have the iRODS command line client installed, or use iDROP web.<br />
For Unix systems the iRODS commandline client is available as an installable package for various<br />
Linux platforms from the e-iRODS website downloads section.<br />
<br />
The community iRODS client should also work, but you need to modify the configuration (iRODS/config/config.mk) and recompile:<br />
<pre><br />
PAM_AUTH = 1<br />
PAM_AUTH_NO_EXTEND = 1<br />
USE_SSL = 1 <br />
</pre><br />
<br />
==== Command line client ====<br />
<br />
The command line client is natural to use for Unix users.<br />
There are versions of the usual ls, rm, mv, mkdir, pwd, rsync<br />
commands prefixed with an i for iRODS, i.e. irm, imv, imkdir etc.<br />
<br />
As expected iput and iget move files to and from the irods system.<br />
All these commands print short help when using the -h option.<br />
<br />
To use these first we need to initialize the iRODS environment.<br />
There is an environment file .irodsEnv in the .irods subdirectory<br />
of the home directory which contains information where and how<br />
to access the iRODS metadata (iCAT) server.<br />
<br />
It looks like (placeholders are in <>):<br />
<pre><br />
irodsHost 'irods.swestore.se'<br />
irodsPort 1247<br />
irodsDefResource 'snicdefResc'<br />
irodsHome '/snicZone/home/<email address>'<br />
irodsCwd '/snicZone/home/<email address>'<br />
irodsUserName '<email address>'<br />
irodsZone 'snicZone'<br />
irodsAuthScheme 'PAM'<br />
</pre><br />
<br />
The iCAT server is irods.swestore.se.<br />
The default irods zone name is snicZone.<br />
The default resource is snicdefResc.<br />
<br />
With the correct environment file, all you need is a Yubikey: run the iinit command to authenticate to the iCAT server. After that you can use the usual iCommands; the ticket is valid for 8 hours.<br />
<br />
More details on the iCommands are available at<br />
https://www.irods.org/index.php/icommands<br />
<br />
==== iDROP web client ====<br />
<br />
The web client is accessible via the URL https://iweb.swestore.se/.<br />
A login screen will be presented first and your Yubikey should<br />
be used to log in.<br />
<br />
= Center storage =<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem in the same way on all computational resources at a centre, and a unified structure and nomenclature across all centres. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their own data when clusters are decommissioned, not even when the storage hardware itself is being replaced.<br />
<br />
== Unified environment ==<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=SweStore/iRODS_icommand&diff=5520SweStore/iRODS icommand2013-10-31T10:39:38Z<p>Tom Langborg (NSC): </p>
<hr />
<div>For accessing the Swestore national storage we use, since Nov 2013, iRODS, and more specifically the iRODS client called "icommands": a set of commands similar to the Unix commands "ls", "cd", "mkdir" etc., but with an "i" in front (ils, icd, imkdir), plus two FTP-like commands, "iput" and "iget" (there are more, but those are the ones you will need most).<br />
<br />
Note that data put on Swestore via iRODS is not accessible through other means.<br />
<br />
== Activate the iRODS icommands ==<br />
The iRODS icommands client is activated through the module system, like so many other things on SNIC clusters.<br />
<br />
1. Log in to a SNIC cluster<br />
2. Execute:<br />
<br />
module load irods<br />
iinit<br />
After activating the iRODS icommands, you will be placed in one of your projects. If you run "ils", you will see a listing of files and folders in that project, something like this:<br />
<br />
[samuel@kalkyl4 ~]$ ils<br />
/ssUppnexZone/proj/b2011221:<br />
C- /ssUppnexZone/proj/b2011221/firstRun<br />
== Navigate around ==<br />
To enter one of the (or the only) folder(s), do:<br />
<br />
icd [folder name]<br />
... in this case:<br />
<br />
icd firstRun<br />
To switch to another project, use<br />
<br />
icd ..<br />
to go up one level. After that, you can change to another project or folder with icd as before:<br />
<br />
icd b2011222<br />
== Upload files ==<br />
To upload the above mentioned folder, do (-r is needed to recurse into directories):<br />
<br />
iput -r [a local folder]<br />
For single files, the -r flag is not needed:<br />
<br />
iput [a local file]<br />
To list the newly uploaded file/directory:<br />
<br />
ils<br />
To create folders, do:<br />
<br />
imkdir [folder-name]<br />
If you want to verify the upload outside of iRODS, there's a utility called ssverify.sh that will let you do that.<br />
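The ssverify.sh script itself is not reproduced here. Purely as an illustration of the underlying idea, here is a minimal sketch that records MD5 checksums for a local directory and verifies them afterwards (the directory layout and file names are made up for the example):<br />

```shell
# Hypothetical stand-in for the idea behind ssverify.sh: record checksums
# for a directory, then verify that the data still matches them later.
set -e
workdir=$(mktemp -d)
mkdir -p "$workdir/test"
echo hello > "$workdir/test/a"
echo world > "$workdir/test/b"

# Record one checksum line per file (in a real transfer you would compare
# against checksums reported by the remote storage system).
(cd "$workdir" && find test -type f -exec md5sum {} + | sort > manifest.md5)

# Verify: md5sum -c prints "test/a: OK" for every matching file and
# exits non-zero if anything differs.
(cd "$workdir" && md5sum -c manifest.md5)
```

If a file is modified between the two steps, md5sum -c flags it as FAILED, which is essentially the mismatch case shown by ssverify.sh.<br />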
<br />
== Downloading files ==<br />
To download a file again, do:<br />
<br />
iget [a file in iRODS]<br />
... or, for folders, do it recursively:<br />
<br />
iget -r [a folder in iRODS]<br />
== File removal ==<br />
To remove a file, you'll likely need to pass -f to irm in order to bypass the trash area, which is currently not supported at Uppmax.<br />
<br />
== More hints ==<br />
You can have iput show transfer progress with the -P flag.<br />
If you want to be placed in a particular directory when you load the irods module, you can create a file called<br />
$HOME/.irods/.irodsEnv<br />
containing<br />
irodsHome '/ssUppnexZone/proj/myproj'<br />
irodsCwd '/ssUppnexZone/proj/myproj'<br />
where<br />
myproj<br />
is replaced by your project id.<br />
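As a sketch, the file can be generated with a few shell commands. This uses a scratch HOME so as not to touch a real configuration, and the project id b2011221 is only the example from the listing above:<br />

```shell
# Write a minimal ~/.irods/.irodsEnv for a given project id.
# HOME is redirected to a scratch directory here so the sketch does not
# overwrite any real iRODS configuration.
HOME=$(mktemp -d)
proj=b2011221   # replace with your own project id

mkdir -p "$HOME/.irods"
cat > "$HOME/.irods/.irodsEnv" <<EOF
irodsHome '/ssUppnexZone/proj/$proj'
irodsCwd '/ssUppnexZone/proj/$proj'
EOF

cat "$HOME/.irods/.irodsEnv"
```

With such a file in place, loading the irods module drops you directly into the given project directory.<br />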
Read more info about the respective i-commands with "[command] -h"</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-irods&diff=5518Swestore-irods2013-10-31T10:32:05Z<p>Tom Langborg (NSC): /* Supported clients */</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
<br />
'''This is not official yet'''<br />
<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], Bioimage Sweden, [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX], [http://wlcg.web.cern.ch/ WLCG] and [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage =<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems.<br />
<br />
Swestore is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc], [http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax]. Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple crash of a storage element to losing an entire site while still providing access to the stored data. <br />
<br />
One of the major advantages of the distributed nature of dCache and iRODS is the excellent aggregated transfer rates possible. This is achieved by bypassing a central node and having transfers go directly to/from the storage elements if the protocol allows it. The Swestore Nationally Accessible Storage system can achieve aggregated transfer rates in excess of 100 Gigabit per second, but in practice throughput is limited by the connectivity to each university (usually 10 Gbit/s) or, when only a few files are transferred, by the per-connection limit (typically<br />
max 1 Gbit/s per file/connection).<br />
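To get a feel for these numbers, a quick back-of-the-envelope calculation shows why a single connection can become the bottleneck for large transfers, and why parallel transfers to several storage elements pay off:<br />

```shell
# Time to move 1 TB over a single connection capped at 1 Gbit/s,
# ignoring protocol overhead (decimal units throughout).
size_bits=8000000000000        # 1 TB = 8 * 10^12 bits
rate_bps=1000000000            # 1 Gbit/s
seconds=$(( size_bits / rate_bps ))
echo "$seconds s (~$(( seconds / 3600 )) h)"
```

At 1 Gbit/s a terabyte takes 8000 seconds, a bit over two hours; ten parallel connections saturating a 10 Gbit/s university link cut that to roughly 13 minutes.<br />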
<br />
== dCache ==<br />
To protect against silent data corruption the dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The system does NOT yet provide protection against user errors like inadvertent file deletions and so on.<br />
<br />
=== Access protocols ===<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
<br />
For authentication, eScience certificates are used, which provide a higher level of security than legacy username/password schemes.<br />
<br />
=== Acquire an eScience client certificate ===<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
; Request membership in the SweGrid VO<br />
: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
=== Download and upload data ===<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways, as a directory index interface at https://webdav.swestore.se/ and with an interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager has many features and functions not supported in SweStore; only the basic file transfer features are supported.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
: There are several tools that are capable of using the protocols provided by SweStore national storage.<br />
: For interactive usage on SNIC clusters we recommend using the ARC tools, which should be installed on all SNIC resources.<br />
: As an integration point for building scripts and automated systems we suggest using the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
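As a rough sketch of what a certificate-authenticated WebDAV transfer with curl could look like — the certificate locations, project name and file names below are assumptions for illustration only; the linked cURL page is authoritative:<br />

```shell
# Sketch: upload and download over WebDAV with a client certificate.
# Certificate paths and the project name are illustrative assumptions.
cert="$HOME/.globus/usercert.pem"
key="$HOME/.globus/userkey.pem"
base="https://webdav.swestore.se/snic/MY_PROJECT"

# -T uploads a file, -o saves a download; --cert/--key supply the
# client certificate used for authentication.
upload()   { curl --fail --cert "$cert" --key "$key" -T "$1" "$base/$1"; }
download() { curl --fail --cert "$cert" --key "$key" -o "$1" "$base/$1"; }

# Shown as a dry run here: print the upload command instead of running it.
cmd="curl --cert $cert --key $key -T data.tar $base/data.tar"
echo "$cmd"
```

The same --cert/--key options work for deleting (-X DELETE) and listing; wrapping them in small functions keeps automation scripts readable.<br />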
<br />
=== Tools and scripts ===<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
=== Slides and more ===<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
=== Usage monitoring ===<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== iRODS ==<br />
<br />
=== Acquire a SweStore YubiKey ===<br />
<br />
For authentication [http://www.yubico.com/products/yubikey-hardware/yubikey/ Yubikey] one-time passwords (OTP) are used. With a simple touch of a button, a 44 character one-time password is generated and sent to the system. <br />
<br />
When you apply for storage, please provide your email address and a physical address where the yubikey should be sent. <br><br />
<br />
=== Supported clients ===<br />
<br />
: iDrop web - Point your Web browser to [https://iweb.swestore.se iweb.swestore.se]<br />
: Command line client [http://eirods.org/download/ eirods icommands] [[SweStore/iRODS_icommand|How to use icommand]]<br />
<br />
=== The SweStore iRODS system ===<br />
<br />
The SweStore iRODS system is hosted at NSC and runs on two<br />
physical servers as a collection of virtual machines.<br />
<br />
The iCAT server handles the metadata. It runs<br />
a Postgres database which contains information about where<br />
to find any particular file in the system.<br />
<br />
There are four storage servers, each with a small amount of<br />
local disk space; they use the dCache system via NFS4 to<br />
store larger amounts of data.<br />
<br />
=== Using the SweStore iRODS system ===<br />
<br />
Detailed documentation, papers and resources are available from<br />
the e-iRODS web site, http://www.eirods.org.<br />
<br />
The web site for the community iRODS is http://www.irods.org.<br />
<br />
To use the system you need the iRODS command line client installed, or you can use the iDROP web client.<br />
For Unix systems the iRODS command line client is available as an installable package for various<br />
Linux platforms from the e-iRODS website downloads section.<br />
<br />
The community iRODS client should also work, but you need to modify its configuration (iRODS/config/config.mk):<br />
<pre><br />
PAM_AUTH = 1<br />
PAM_AUTH_NO_EXTEND = 1<br />
USE_SSL = 1 <br />
</pre><br />
<br />
==== Command line client ====<br />
<br />
The command line client will feel natural to Unix users.<br />
There are versions of the usual ls, rm, mv, mkdir, pwd and rsync<br />
commands prefixed with an i for iRODS, i.e. irm, imv, imkdir etc.<br />
<br />
As expected, iput and iget move files to and from the iRODS system.<br />
All these commands print a short help text when given the -h option.<br />
<br />
To use these commands, we first need to initialize the iRODS environment.<br />
There is an environment file .irodsEnv in the .irods subdirectory<br />
of the home directory which contains information where and how<br />
to access the iRODS metadata (iCAT) server.<br />
<br />
It looks like (placeholders are in <>):<br />
<pre><br />
irodsHost 'irods.swestore.se'<br />
irodsPort 1247<br />
irodsDefResource 'snicdefResc'<br />
irodsHome '/snicZone/home/<email address>'<br />
irodsCwd '/snicZone/home/<email address>'<br />
irodsUserName '<email address>'<br />
irodsZone 'snicZone'<br />
irodsAuthScheme 'PAM'<br />
</pre><br />
<br />
The iCAT server is irods.swestore.se.<br />
The default irods zone name is snicZone.<br />
The default resource is snicdefResc.<br />
<br />
With the correct environment file in place, all you need is your YubiKey: run the iinit command to authenticate to the iCAT server. After that you can use the usual iCommands. The authentication ticket is valid for 8 hours.<br />
<br />
More details on the individual iCommands are available at<br />
https://www.irods.org/index.php/icommands<br />
<br />
==== iDROP web client ====<br />
<br />
The web client is accessible via the URL https://iweb.swestore.se/.<br />
A login screen will be presented first and your Yubikey should<br />
be used to log in.<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
= Center storage =<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem the same way on all computational resources at a centre, and a unified structure and nomenclature across all centres. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their data when clusters are decommissioned, not even when the storage hardware itself is replaced.<br />
<br />
== Unified environment ==<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-irods&diff=5508Swestore-irods2013-10-31T09:18:27Z<p>Tom Langborg (NSC): /* Apply for YubiKey */</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
<br />
'''This is not official yet'''<br />
<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], Bioimage Sweden, [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX], [http://wlcg.web.cern.ch/ WLCG] and [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage =<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems.<br />
<br />
Swestore is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc], [http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax]. Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple crash of a storage element to losing an entire site while still providing access to the stored data. <br />
<br />
One of the major advantages of the distributed nature of dCache and iRODS is the excellent aggregated transfer rates possible. This is achieved by bypassing a central node and having transfers go directly to/from the storage elements if the protocol allows it. The Swestore Nationally Accessible Storage system can achieve aggregated transfer rates in excess of 100 Gigabit per second, but in practice throughput is limited by the connectivity to each university (usually 10 Gbit/s) or, when only a few files are transferred, by the per-connection limit (typically<br />
max 1 Gbit/s per file/connection).<br />
<br />
== dCache ==<br />
To protect against silent data corruption the dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The system does NOT yet provide protection against user errors like inadvertent file deletions and so on.<br />
<br />
=== Access protocols ===<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
<br />
For authentication, eScience certificates are used, which provide a higher level of security than legacy username/password schemes.<br />
<br />
=== Acquire an eScience client certificate ===<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
; Request membership in the SweGrid VO<br />
: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
=== Download and upload data ===<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways: as a directory index interface at https://webdav.swestore.se/ and as an interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager has many features and functions that SweStore does not support; only the basic file transfer features are supported.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
There are several tools that are capable of using the protocols provided by SweStore national storage.<br />
For interactive usage on SNIC clusters we recommend using the ARC tools which should be installed on all SNIC resources.<br />
As an integration point for building scripts and automated systems we suggest using the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
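As a sketch of the curl-based automation approach suggested above (the certificate and key paths are illustrative assumptions; see the linked cURL instructions for the exact options to use with SweStore):<br />

```python
import subprocess

WEBDAV_BASE = "https://webdav.swestore.se/snic"  # project root from the text above

def build_upload_cmd(local_path, project, remote_name,
                     cert="~/.globus/usercert.pem",
                     key="~/.globus/userkey.pem"):
    """Assemble a curl command that uploads one file over WebDAV,
    authenticating with an eScience client certificate.

    The default cert/key locations are a common grid convention,
    not something SweStore mandates.
    """
    return [
        "curl",
        "--cert", cert,   # client certificate (PEM)
        "--key", key,     # matching private key
        "--fail",         # non-zero exit status on HTTP errors, for scripting
        "-T", local_path, # upload the local file
        f"{WEBDAV_BASE}/{project}/{remote_name}",
    ]

if __name__ == "__main__":
    # Run the transfer; check the exit status in real automation.
    print(" ".join(build_upload_cmd("results.tar.gz", "myproject", "results.tar.gz")))
```

In a real pipeline you would pass the list to `subprocess.run(...)` and retry on failure.<br />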
<br />
=== Tools and scripts ===<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
=== Slides and more ===<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
=== Usage monitoring ===<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
== iRODS ==<br />
<br />
=== Apply for YubiKey ===<br />
<br />
For authentication, [http://www.yubico.com/products/yubikey-hardware/yubikey/ YubiKey] one-time passwords (OTP) are used. With a simple touch of a button, a 32-character one-time password is generated and sent to the system. <br />
<br />
When you apply for storage, please provide your email address and a physical address where the YubiKey should be sent. <br><br />
A YubiKey is a USB keyboard emulator that generates a one-time password every time you press its button.<br />
<br />
=== Supported clients ===<br />
<br />
: CLI client [http://eirods.org/download/ eirods icommands]<br />
: Point your web browser to [https://iweb.swestore.se iweb.swestore.se]<br />
: A desktop Java client (iDrop) will soon be available.<br />
<br />
=== Using SweStore iRODS ===<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
<br />
== Support == <br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
= Center storage =<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem in the same way on all computational resources at a centre, and a unified structure and nomenclature across all centres. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their own data when clusters are decommissioned, not even when the storage hardware itself is replaced.<br />
<br />
== Unified environment ==<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-irods&diff=5484Swestore-irods2013-10-30T15:46:00Z<p>Tom Langborg (NSC): /* Access protocols */</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
<br />
'''This is not official yet'''<br />
<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], Bioimage Sweden, [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX], [http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage =<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long<br />
term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems and is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc],<br />
[http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax].<br />
<br />
Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple<br />
crash of a storage element to losing an entire site while still providing access to the stored data. To protect against silent data corruption the<br />
dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The system does NOT yet provide protection against user errors like inadvertent file deletions and so on.<br />
<br />
One of the major advantages of the distributed nature of dCache is the excellent aggregated transfer rates possible. This is achieved by bypassing a central node<br />
and having transfers go directly to/from the storage elements if the protocol allows it.<br />
The Swestore Nationally Accessible Storage system can achieve aggregated transfer rates<br />
in excess of 100 gigabits per second, but in practice rates are limited by each university's connectivity (usually 10 Gbit/s) and by the per-file transfer speed (typically at most 1 Gbit/s per file/connection).<br />
<br />
==Access protocols==<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
: iRODS with YubiKey authentication<br />
<br />
For authentication, eScience certificates are used, which provide a higher level of security than legacy username/password schemes.<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
<br />
===dCache===<br />
; Acquire an eScience client certificate<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
; Request membership in the SweGrid VO<br />
: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
==== Download and upload data ====<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways: as a directory index interface at https://webdav.swestore.se/ and as an interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager has many features and functions that SweStore does not support; only the basic file transfer features are supported.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
There are several tools that are capable of using the protocols provided by SweStore national storage.<br />
For interactive usage on SNIC clusters we recommend using the ARC tools which should be installed on all SNIC resources.<br />
As an integration point for building scripts and automated systems we suggest using the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
<br />
=== iRODS===<br />
<br />
====Download and upload data ====<br />
====Apply for YubiKey====<br />
<br />
== More information ==<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
== Tools and scripts ==<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
== Slides and more ==<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
= Center storage =<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem in the same way on all computational resources at a centre, and a unified structure and nomenclature across all centres. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their own data when clusters are decommissioned, not even when the storage hardware itself is replaced.<br />
<br />
== Unified environment ==<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-irods&diff=5483Swestore-irods2013-10-30T15:43:27Z<p>Tom Langborg (NSC): /* Getting access */</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
<br />
'''This is not official yet'''<br />
<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], Bioimage Sweden, [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX], [http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage =<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long<br />
term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems and is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc],<br />
[http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax].<br />
<br />
Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple<br />
crash of a storage element to losing an entire site while still providing access to the stored data. To protect against silent data corruption the<br />
dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The system does NOT yet provide protection against user errors like inadvertent file deletions and so on.<br />
<br />
One of the major advantages of the distributed nature of dCache is the excellent aggregated transfer rates possible. This is achieved by bypassing a central node<br />
and having transfers go directly to/from the storage elements if the protocol allows it.<br />
The Swestore Nationally Accessible Storage system can achieve aggregated transfer rates<br />
in excess of 100 gigabits per second, but in practice rates are limited by each university's connectivity (usually 10 Gbit/s) and by the per-file transfer speed (typically at most 1 Gbit/s per file/connection).<br />
<br />
==Access protocols==<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
: iRODS with YubiKey authentication<br />
<br />
For authentication, eScience certificates are used, which provide a higher level of security than legacy username/password schemes.<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
<br />
===dCache===<br />
; Acquire an eScience client certificate<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
; Request membership in the SweGrid VO<br />
: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
==== Download and upload data ====<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways: as a directory index interface at https://webdav.swestore.se/ and as an interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager has many features and functions that SweStore does not support; only the basic file transfer features are supported.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
There are several tools that are capable of using the protocols provided by SweStore national storage.<br />
For interactive usage on SNIC clusters we recommend using the ARC tools which should be installed on all SNIC resources.<br />
As an integration point for building scripts and automated systems we suggest using the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
<br />
=== iRODS===<br />
<br />
====Download and upload data ====<br />
====Apply for YubiKey====<br />
<br />
== More information ==<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
== Tools and scripts ==<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
== Slides and more ==<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
= Center storage =<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem in the same way on all computational resources at a centre, and a unified structure and nomenclature across all centres. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their own data when clusters are decommissioned, not even when the storage hardware itself is replaced.<br />
<br />
== Unified environment ==<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Apply_for_storage_on_Swestore&diff=5482Apply for storage on Swestore2013-10-30T15:17:39Z<p>Tom Langborg (NSC): </p>
<hr />
<div>[[Category:SweStore]]<br />
[[Category:SweStore user guide]]<br />
[[SweStore|< SweStore]]<br />
<br />
The SweStore nationally accessible storage is available for researchers financed by VR (which includes all researchers using SNIC compute resources) and FORMA.<br />
<br />
SweStore is also in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], Bioimage, [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX], [http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet]. If any of these cover your research area, first read their information on applying for SweStore storage.<br />
<br />
In the future, applications for storage will be handled by each research community, but for now an email to [mailto:support@swestore.se support@swestore.se] will suffice. <br />
<br />
Please include the following information in the application:<br><br />
'''From 1 November you can apply for either dCache or iRODS storage.'''<br />
* Type of storage dCache/iRODS<br />
* Name of the principal investigator (PI), including email address.<br />
* Purpose for the storage: A short description of the project and type of data.<br />
* Required storage capacity: Preferably a maximum size, but if this is not currently determinable, please estimate a starting size and the expected growth per time period. '''NOTE''' that applications larger than 10 TB take longer to process.<br />
* Suggested project name: This will be used as root directory name for your storage.<br />
# '''NOTE''' that this name is long-lived and will persist. It is not coupled to the lifetime of SNIC compute time allocations.<br />
# We recommend a project name not tied to a person.<br />
# Additionally, we recommend that the name is not a common word or term easily confusable with other current or future research efforts.<br />
# It is a good idea to select a name that's short and easy to type.<br />
# The name is limited to lower-case letters a-z, digits 0-9, hyphens - and underscores _.</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-irods&diff=5481Swestore-irods2013-10-30T14:59:02Z<p>Tom Langborg (NSC): </p>
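The character restrictions above can be checked with a short script; the helper name here is hypothetical, for illustration only.<br />

```python
import re

# Illustrative validator for the project-name rule quoted above:
# lower-case letters a-z, digits 0-9, hyphens and underscores only.
PROJECT_NAME_RE = re.compile(r"[a-z0-9_-]+")

def is_valid_project_name(name: str) -> bool:
    """Return True if name is non-empty and uses only the allowed characters."""
    return PROJECT_NAME_RE.fullmatch(name) is not None
```

For example, `is_valid_project_name("climate_model-2013")` passes, while a name containing spaces or capital letters does not.<br />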
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
<br />
'''This is not official yet'''<br />
<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], Bioimage Sweden, [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX], [http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage =<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long<br />
term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems and is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc],<br />
[http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax].<br />
<br />
Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple<br />
crash of a storage element to losing an entire site while still providing access to the stored data. To protect against silent data corruption the<br />
dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The system does NOT yet provide protection against user errors like inadvertent file deletions and so on.<br />
<br />
One of the major advantages of the distributed nature of dCache is the excellent aggregated transfer rates possible. This is achieved by bypassing a central node<br />
and having transfers go directly to/from the storage elements if the protocol allows it.<br />
The Swestore Nationally Accessible Storage system can achieve aggregated transfer rates<br />
in excess of 100 gigabits per second, but in practice rates are limited by each university's connectivity (usually 10 Gbit/s) and by the per-file transfer speed (typically at most 1 Gbit/s per file/connection).<br />
<br />
==Access protocols==<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1<br />
: iRODS with YubiKey authentication<br />
<br />
For authentication, eScience certificates are used, which provide a higher level of security than legacy username/password schemes.<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
<br />
===dCache===<br />
; Acquire an eScience client certificate<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
; Request membership in the SweGrid VO<br />
: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
==== Download and upload data ====<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways: as a directory index at https://webdav.swestore.se/ and as an interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager offers many features and functions that SweStore does not support; only the basic file transfer features work.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
: There are several tools capable of using the protocols provided by SweStore national storage.<br />
: For interactive usage on SNIC clusters we recommend the ARC tools, which should be installed on all SNIC resources.<br />
: As an integration point for scripts and automated systems we suggest the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
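As a concrete sketch of the cURL route above: the helpers below build, but do not execute, curl command lines for uploading to and deleting from the authenticated WebDAV endpoint. The <code>/snic/PROJECT/</code> layout comes from the browsing instructions above; the certificate and key file names are placeholders for your own eScience credentials, and <code>--cert</code>, <code>--key</code>, <code>-T</code> and <code>-X</code> are standard curl options.<br />

```python
# Build (not execute) curl argv lists for the authenticated WebDAV
# endpoint. "usercert.pem"/"userkey.pem" are placeholder paths for your
# own eScience certificate and key; adjust to your grid tool setup.
BASE = "https://webdav.swestore.se/snic"

def webdav_url(project: str, remote_path: str) -> str:
    """URL for a file under a project, following the /snic/PROJECT layout."""
    return f"{BASE}/{project}/{remote_path.lstrip('/')}"

def curl_upload(project, remote_path, local_file,
                cert="usercert.pem", key="userkey.pem"):
    """curl argv that uploads local_file with an HTTP PUT (-T)."""
    return ["curl", "--cert", cert, "--key", key,
            "-T", local_file, webdav_url(project, remote_path)]

def curl_delete(project, remote_path, cert="usercert.pem", key="userkey.pem"):
    """curl argv that removes a remote file with an HTTP DELETE."""
    return ["curl", "--cert", cert, "--key", key,
            "-X", "DELETE", webdav_url(project, remote_path)]
```

Pass the resulting lists to <code>subprocess.run</code> to actually perform a transfer.<br />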
<br />
=== iRODS===<br />
<br />
== More information ==<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
== Tools and scripts ==<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
== Slides and more ==<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
= Centre storage =<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem in the same way on all computational resources at a centre, and a unified structure and nomenclature across all centres. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their own data when clusters are decommissioned, not even when the storage hardware itself is replaced.<br />
<br />
== Unified environment ==<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-irods&diff=5480Swestore-irods2013-10-30T14:52:49Z<p>Tom Langborg (NSC): </p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
<br />
'''This is not official yet'''<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], Bioimage Sweden, [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX], [http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage =<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long<br />
term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems and is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc],<br />
[http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax].<br />
<br />
Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple<br />
crash of a storage element to losing an entire site while still providing access to the stored data. To protect against silent data corruption, the<br />
dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The system does NOT yet provide protection against user errors such as inadvertent file deletion.<br />
<br />
One of the major advantages to the distributed nature of dCache is the excellent aggregated transfer rates possible. This is achieved by bypassing a central node<br />
and having transfers going directly to/from the storage elements if the protocol allows it.<br />
The Swestore Nationally Accessible Storage system can achieve aggregated transfer rates<br />
in excess of 100 Gbit/s. In practice, however, transfers are limited by each university's connectivity (usually 10 Gbit/s) and, when only a few files are<br />
transferred, by the per-file/per-connection rate (typically at most 1 Gbit/s).<br />
<br />
==Access protocols==<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1, iRODS<br />
: iRODS with yubikey as authentication <br />
<br />
Authentication uses eScience certificates, which provide a higher level of security than legacy username/password schemes.<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
; Acquire an eScience client certificate<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
; Request membership in the SweGrid VO<br />
: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
== Download and upload data ==<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways: as a directory index at https://webdav.swestore.se/ and as an interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager offers many features and functions that SweStore does not support; only the basic file transfer features work.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
: There are several tools capable of using the protocols provided by SweStore national storage.<br />
: For interactive usage on SNIC clusters we recommend the ARC tools, which should be installed on all SNIC resources.<br />
: As an integration point for scripts and automated systems we suggest the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
<br />
== More information ==<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
== Tools and scripts ==<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
== Slides and more ==<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
= Centre storage =<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem in the same way on all computational resources at a centre, and a unified structure and nomenclature across all centres. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their own data when clusters are decommissioned, not even when the storage hardware itself is replaced.<br />
<br />
== Unified environment ==<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-irods&diff=5479Swestore-irods2013-10-30T14:50:33Z<p>Tom Langborg (NSC): Created page with "Category:Storage Category:SweStore SNIC is building a storage infrastructure to complement the computational resources. Many forms of automated measurements can produce ..."</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se/ ECDS], [http://snd.gu.se/ SND], Bioimage Sweden, [http://www.bils.se/ BILS], [http://www.uppnex.uu.se/ UPPNEX], [http://wlcg.web.cern.ch/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage =<br />
The Swestore Nationally Accessible Storage, commonly called just Swestore, is a robust, flexible and expandable long<br />
term storage system aimed at storing large amounts of data produced by various Swedish research projects. It is based on the [http://www.dcache.org dCache] and [http://www.irods.org iRODS]<br />
storage systems and is distributed across the SNIC centres [http://www.c3se.chalmers.se/ C3SE], [http://www.hpc2n.umu.se/ HPC2N], [http://www.lunarc.lu.se/ Lunarc],<br />
[http://www.nsc.liu.se/ NSC], [http://www.pdc.kth.se PDC] and [http://www.uppmax.uu.se Uppmax].<br />
<br />
Data is stored in two copies with each copy at a different SNIC centre. This enables the system to cope with a multitude of issues ranging from a simple<br />
crash of a storage element to losing an entire site while still providing access to the stored data. To protect against silent data corruption, the<br />
dCache storage system checksums all stored data and periodically verifies the data using this checksum.<br />
<br />
The system does NOT yet provide protection against user errors such as inadvertent file deletion.<br />
<br />
One of the major advantages to the distributed nature of dCache is the excellent aggregated transfer rates possible. This is achieved by bypassing a central node<br />
and having transfers going directly to/from the storage elements if the protocol allows it.<br />
The Swestore Nationally Accessible Storage system can achieve aggregated transfer rates<br />
in excess of 100 Gbit/s. In practice, however, transfers are limited by each university's connectivity (usually 10 Gbit/s) and, when only a few files are<br />
transferred, by the per-file/per-connection rate (typically at most 1 Gbit/s).<br />
<br />
==Access protocols==<br />
; Currently supported protocols<br />
: GridFTP - gsiftp://gsiftp.swestore.se/<br />
: Storage Resource Manager - srm://srm.swegrid.se/<br />
: Hypertext Transfer Protocol (read-only), Web Distributed Authoring and Versioning - http://webdav.swestore.se/ (unauthenticated), https://webdav.swestore.se/<br />
: NFS4.1, iRODS<br />
: iRODS with yubikey as authentication <br />
<br />
Authentication uses eScience certificates, which provide a higher level of security than legacy username/password schemes.<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions on the [[Apply for storage on SweStore]] page.<br />
; Acquire an eScience client certificate<br />
: Follow the instructions on [[Grid_certificates#Requesting_a_certificate|Requesting a certificate]] to get your client certificate. This step can be performed while waiting for the storage application to be approved and processed. Of course, if you already have a valid eScience certificate you don't need to acquire another one.<br />
:; For Terena certificates<br />
:: If intending to access SweStore from a SNIC resource, please make sure you also [[Exporting_a_client_certificate|export the certificate]], transfer it to the intended SNIC resource and [[Preparing_a_client_certificate|prepare it for use with grid tools]] (not necessarily needed with ARC 3.x, see [[Grid_certificates#Creating_a_proxy_certificate_using_the_Firefox.2FThunderbird_credential_store|proxy certificates using Firefox credential store]]).<br />
:; For Nordugrid certificates<br />
:: Please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
; Request membership in the SweGrid VO<br />
: Follow the instructions on [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|Requesting membership in the SweGrid VO]] to get added to the SweGrid Virtual Organisation (VO) and request membership to your allocated storage project.<br />
<br />
== Download and upload data ==<br />
; Interactive browsing and manipulation of single files<br />
: SweStore is accessible in your web browser in two ways: as a directory index at https://webdav.swestore.se/ and as an interactive file manager at https://webdav.swestore.se/browser/. '''Note''' that the interactive file manager offers many features and functions that SweStore does not support; only the basic file transfer features work.<br />
: To browse private data you need to have your certificate installed in your browser (default with Terena certificates, see above). Projects are organized under the <code>/snic</code> directory as <code><nowiki>https://webdav.swestore.se/snic/YOUR_PROJECT_NAME/</nowiki></code>.<br />
; Upload and delete data interactively or with automation<br />
: There are several tools capable of using the protocols provided by SweStore national storage.<br />
: For interactive usage on SNIC clusters we recommend the ARC tools, which should be installed on all SNIC resources.<br />
: As an integration point for scripts and automated systems we suggest the curl program and library.<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]]. '''Recommended''' method when logged in on SNIC resources.<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with cURL]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
<br />
== More information ==<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
<br />
If you have any issues using SweStore please do not hesitate to contact [mailto:support@swestore.se support@swestore.se].<br />
<br />
== Tools and scripts ==<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the arc client (Only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [[SweStore/swetrans_arc|swetrans_arc]], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
== Slides and more ==<br />
<br />
[http://docs.snic.se/wiki/Swestore/Lund_Seminar_Apr18 Slides and material from seminar for Lund users on April 18th]<br />
<br />
= Centre storage =<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem in the same way on all computational resources at a centre, and a unified structure and nomenclature across all centres. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their own data when clusters are decommissioned, not even when the storage hardware itself is replaced.<br />
<br />
== Unified environment ==<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=LUNARC_storage&diff=4732LUNARC storage2013-02-13T05:59:23Z<p>Tom Langborg (NSC): Created page with "{{resource info |description=LUNARC Swestore storage node of 400Tb |resource type=storage |centre=LUNARC |production=yes |commissioning date=September 2007 |decommissioning date=..."</p>
<hr />
<div>{{resource info<br />
|description=LUNARC Swestore storage node of 400Tb<br />
|resource type=storage<br />
|centre=LUNARC<br />
|production=yes<br />
|commissioning date=September 2007<br />
|decommissioning date=<br />
}}<br />
<br />
LUNARC contributes 400 TB of storage to Swestore.<br />
<br />
<!--<br />
== Links ==<br />
* [http://example.com System details]<br />
--></div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=C3SE_storage&diff=4730C3SE storage2013-02-13T05:57:22Z<p>Tom Langborg (NSC): Created page with "{{resource info |description=C3SE Swestore storage node of 400Tb |resource type=storage |centre=C3SE |production=yes |commissioning date=September 2007 |decommissioning date= }} ..."</p>
<hr />
<div>{{resource info<br />
|description=C3SE Swestore storage node of 400Tb<br />
|resource type=storage<br />
|centre=C3SE<br />
|production=yes<br />
|commissioning date=September 2007<br />
|decommissioning date=<br />
}}<br />
<br />
C3SE contributes 400 TB of storage to Swestore.<br />
<br />
<!--<br />
== Links ==<br />
* [http://example.com System details]<br />
--></div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=UPPMAX_storage&diff=4728UPPMAX storage2013-02-13T05:55:39Z<p>Tom Langborg (NSC): Created page with "{{resource info |description=UPPMAX Swestore storage node of 200Tb |resource type=storage |centre=UPPMAX |production=yes |commissioning date=September 2007 |decommissioning date=..."</p>
<hr />
<div>{{resource info<br />
|description=UPPMAX Swestore storage node of 200Tb<br />
|resource type=storage<br />
|centre=UPPMAX<br />
|production=yes<br />
|commissioning date=September 2007<br />
|decommissioning date=<br />
}}<br />
<br />
UPPMAX contributes 200 TB of storage to Swestore.<br />
<br />
<!--<br />
== Links ==<br />
* [http://example.com System details]<br />
--></div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=HPC2N_storage&diff=4727HPC2N storage2013-02-13T05:54:55Z<p>Tom Langborg (NSC): Created page with "{{resource info |description=HPC2N Swestore storage node of 400Tb |resource type=storage |centre=HPC2N |production=yes |commissioning date=September 2007 |decommissioning date= }..."</p>
<hr />
<div>{{resource info<br />
|description=HPC2N Swestore storage node of 400Tb<br />
|resource type=storage<br />
|centre=HPC2N<br />
|production=yes<br />
|commissioning date=September 2007<br />
|decommissioning date=<br />
}}<br />
<br />
HPC2N contributes 400 TB of storage to Swestore.<br />
<br />
<!--<br />
== Links ==<br />
* [http://example.com System details]<br />
--></div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=PDC_storage&diff=4726PDC storage2013-02-12T13:03:10Z<p>Tom Langborg (NSC): Created page with "{{resource info |description=PDC Swestore storage node of 200Tb |resource type=storage |centre=PDC |production=yes |commissioning date=September 2007 |decommissioning date= }} P..."</p>
<hr />
<div>{{resource info<br />
|description=PDC Swestore storage node of 200Tb<br />
|resource type=storage<br />
|centre=PDC<br />
|production=yes<br />
|commissioning date=September 2007<br />
|decommissioning date=<br />
}}<br />
<br />
PDC contributes 200 TB of storage to Swestore.<br />
<br />
<!--<br />
== Links ==<br />
* [http://example.com System details]<br />
--></div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=NSC_storage&diff=4725NSC storage2013-02-12T12:47:35Z<p>Tom Langborg (NSC): </p>
<hr />
<div>{{resource info<br />
|description=NSC Swestore storage node of 200Tb<br />
|resource type=storage<br />
|centre=NSC<br />
|production=yes<br />
|commissioning date=September 2007<br />
|decommissioning date=<br />
}}<br />
<br />
NSC contributes 200 TB of storage to Swestore.<br />
<br />
<!--<br />
== Links ==<br />
* [http://example.com System details]<br />
--></div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=NSC_storage&diff=4724NSC storage2013-02-12T12:46:49Z<p>Tom Langborg (NSC): </p>
<hr />
<div>{{resource info<br />
|description=NSC Swestore storage node of 200Tb<br />
|resource type=storage<br />
|centre=NSC<br />
|production=yes<br />
|commissioning date=September 2007<br />
|decommissioning date=<br />
}}<br />
<br />
NSC contributes to the Swestore storage with XTb of storage.<br />
<br />
<!--<br />
== Links ==<br />
* [http://example.com System details]<br />
--></div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Apply_for_storage_on_Swestore&diff=4537Apply for storage on Swestore2012-10-03T15:53:46Z<p>Tom Langborg (NSC): </p>
<hr />
<div>[[Category:SweStore]]<br />
[[Category:SweStore user guide]]<br />
Swestore is in collaboration with [http://www.ecds.se ECDS], [http://snd.gu.se SND], Bioimage, [http://www.bils.se BILS], [http://www.uppnex.uu.se UPPNEX], [http://lcg.web.cern.ch/lcg/public/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet]. If any of these covers your research area, first read their information on applying for SweStore storage. In the future, applications for storage will be handled by each research community, but for now an email to [mailto:swestore-support@snic.vr.se swestore-support@snic.vr.se] will suffice.<br />
<br />
Please include the following information:<br />
* Name of the principal investigator (PI).<br />
* Purpose for the storage: A short description of the project and type of data.<br />
* Required storage capacity: Preferably a maximum size, but if this is not currently determinable, please calculate a starting size and expansion by time period.<br />
* Suggested project name: This will be used as root directory name for your storage.</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-dCache&diff=4288Swestore-dCache2012-08-09T10:09:40Z<p>Tom Langborg (NSC): /* Supported access protocol */</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se ECDS], [http://snd.gu.se SND], Bioimage Sweden, [http://www.bils.se BILS], [http://www.uppnex.uu.se UPPNEX], [http://lcg.web.cern.ch/lcg/public/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage =<br />
The aim of the nationally accessible storage is to build a robust, flexible and expandable system that can<br />
be used in most cases where access to large scale storage is needed. To the user it should appear as a single large system,<br />
while it is desirable that some parts of the system are distributed across all SNIC centres to benefit from the advantages<br />
of, among other things, locality and cache effects. The system is intended as a versatile long-term storage system.<br />
<br />
==Supported access protocol==<br />
; SweStore currently supports these protocols<br />
: srm://, gsiftp://, http:// (read-only), https:// (read-only), WebDAV (read-write).<br />
; Support is planned for these protocols<br />
: NFS4.1, iRODS<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow instructions [[Apply for storage on SweStore|here]]<br />
; Get a client certificate.<br />
: Follow the instructions [[Grid_certificates#Requesting_a_certificate|here]] to get your client certificate. For Terena certificates, please make sure you also [[Requesting_a_grid_certificate_using_the_Terena_eScience_Portal#Exporting Terena certificate for use with Grid tools|export the certificate for use with grid tools]]. For Nordugrid certificates, please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
; Request membership in the SweGrid VO.<br />
: Follow the instructions [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|here]] to get added to the SweGrid virtual organisation.<br />
<br />
== Download and upload data ==<br />
; Browse and download data<br />
: SweStore is accessible from your web browser at http://webdav.swegrid.se/. To browse private data you must first install your certificate in your browser (see above). Your data is available at <code><nowiki>http://webdav.swegrid.se/snic/YOUR_PROJECT_NAME</nowiki></code>.<br />
; Upload and delete data<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with the cURL|accessing SweStore national storage with cURL]].<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
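As a concrete sketch of the access methods above: the allocation name MYPROJ, the file names, and the srm:// endpoint are hypothetical placeholders (the linked instruction pages give the actual endpoints), and the certificate paths are the usual grid defaults:<br />

```shell
#!/bin/sh
# Sketch of a SweStore session: browse and download over WebDAV with
# curl, then upload and delete with the ARC client. The echo prefix
# only prints each command; remove it to actually run them (the ARC
# commands additionally need a valid grid proxy, see arcproxy).
PROJECT=MYPROJ                                   # hypothetical allocation name
WEBDAV="https://webdav.swegrid.se/snic/$PROJECT" # per the WebDAV URL above
DEST="srm://srm.swegrid.se/snic/$PROJECT"        # placeholder SRM endpoint
CERT="$HOME/.globus/usercert.pem"                # exported client certificate
KEY="$HOME/.globus/userkey.pem"

echo curl --cert "$CERT" --key "$KEY" "$WEBDAV/"                   # list
echo curl --cert "$CERT" --key "$KEY" -o out.dat "$WEBDAV/out.dat" # download
echo arccp data.tar.gz "$DEST/data.tar.gz"                         # upload
echo arcrm "$DEST/data.tar.gz"                                     # delete
```

The same URLs work with lftp and globus-url-copy; only the client syntax differs.<br />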
<br />
== Examples of storage projects ==<br />
Below are some examples of projects that use SweStore today.<br />
<br />
{|border="1" style="text-align:left; border-collapse: collapse; border-width: 1px; border-style: solid; border-color: #000" class="wikitable sortable" valign=top<br />
!Allocation name<br />
!Size in TB<br />
!class="unsortable"|Project full name<br />
|-<br />
|alice<br />
|400<br />
|<br />
|-<br />
|uppnex<br />
|140<br />
|[https://www.uppnex.uu.se UPPmax NExt Generation Sequencing Cluster & Storage]<br />
|-<br />
|brain_protein_atlas<br />
|10<br />
|Mouse brain protein atlas project<br />
|-<br />
| scims2lab<br />
|20<br />
| Identification of novel gene models by matching mass spectrometry data against 6-frame translations of the human genome<br />
|-<br />
|subatom<br />
|<br />
|Low-energy nuclear theory and experiment<br />
|-<br />
|genomics-gu<br />
|10<br />
|Genomics Core Facility, Sahlgrenska Academy at the University of Gothenburg.<br />
|-<br />
|Chemo<br />
|5<br />
|Genetic interaction networks in human disease<br />
|-<br />
|cesm1_holocene<br />
|30<br />
|Arctic sea ice in warm climates<br />
|}<br />
<br />
== More information ==<br />
* [[SweStore introduction]]<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
* [[Accessing SweStore national storage with the ARC client]]<br />
<!-- * [[Mounting SweStore national storage via WebDAV|Mounting SweStore national storage via WebDAV (Not recomendated at the moment)]] --><br />
If you have any issues using SweStore, please do not hesitate to contact [mailto:swestore-support@snic.vr.se swestore-support].<br />
<br />
== Tools and scripts ==<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the ARC client (only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [http://snicdocs.nsc.liu.se/wiki/SweStore/swstrans_arc swetrans_arc], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
= Centre storage =<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem in the same way on all computational resources at a centre, and a unified structure and nomenclature for all centra. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their data when clusters are decommissioned, not even when the storage hardware itself is replaced.<br />
<br />
== Unified environment ==<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-dCache&diff=4287Swestore-dCache2012-08-09T10:06:33Z<p>Tom Langborg (NSC): /* Getting access */</p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se ECDS], [http://snd.gu.se SND], Bioimage Sweden, [http://www.bils.se BILS], [http://www.uppnex.uu.se UPPNEX], [http://lcg.web.cern.ch/lcg/public/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage =<br />
The aim of the nationally accessible storage is to build a robust, flexible and expandable system that can<br />
be used in most cases where access to large scale storage is needed. To the user it should appear as a single large system,<br />
while it is desirable that some parts of the system are distributed across all SNIC centra to benefit from the advantages<br />
of, among other things, locality and cache effects. The system is intended as a versatile long-term storage system.<br />
<br />
==Supported access protocols==<br />
; SweStore currently supports the following protocols<br />
: srm://, gsiftp://, http:// (read-only), https:// (read-only), WebDAV (read-write).<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions [[Apply for storage on SweStore|here]].<br />
; Get a client certificate.<br />
: Follow the instructions [[Grid_certificates#Requesting_a_certificate|here]] to get your client certificate. For Terena certificates, please make sure you also [[Requesting_a_grid_certificate_using_the_Terena_eScience_Portal#Exporting Terena certificate for use with Grid tools|export the certificate for use with grid tools]]. For Nordugrid certificates, please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
; Request membership in the SweGrid VO.<br />
: Follow the instructions [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|here]] to get added to the SweGrid virtual organisation.<br />
<br />
== Download and upload data ==<br />
; Browse and download data<br />
: SweStore is accessible from your web browser at http://webdav.swegrid.se/. To browse private data you must first install your certificate in your browser (see above). Your data is available at <code><nowiki>http://webdav.swegrid.se/snic/YOUR_PROJECT_NAME</nowiki></code>.<br />
; Upload and delete data<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]].<br />
: Use cURL. Please see the instructions for [[Accessing SweStore national storage with the cURL|accessing SweStore national storage with cURL]].<br />
: Use lftp. Please see the instructions for [[Accessing SweStore national storage with lftp]].<br />
: Use globus-url-copy. Please see the instructions for [[Accessing SweStore national storage with globus-url-copy]].<br />
<br />
== Examples of storage projects ==<br />
Below are some examples of projects that use SweStore today.<br />
<br />
{|border="1" style="text-align:left; border-collapse: collapse; border-width: 1px; border-style: solid; border-color: #000" class="wikitable sortable" valign=top<br />
!Allocation name<br />
!Size in TB<br />
!class="unsortable"|Project full name<br />
|-<br />
|alice<br />
|400<br />
|<br />
|-<br />
|uppnex<br />
|140<br />
|[https://www.uppnex.uu.se UPPmax NExt Generation Sequencing Cluster & Storage]<br />
|-<br />
|brain_protein_atlas<br />
|10<br />
|Mouse brain protein atlas project<br />
|-<br />
| scims2lab<br />
|20<br />
| Identification of novel gene models by matching mass spectrometry data against 6-frame translations of the human genome<br />
|-<br />
|subatom<br />
|<br />
|Low-energy nuclear theory and experiment<br />
|-<br />
|genomics-gu<br />
|10<br />
|Genomics Core Facility, Sahlgrenska Academy at the University of Gothenburg.<br />
|-<br />
|Chemo<br />
|5<br />
|Genetic interaction networks in human disease<br />
|-<br />
|cesm1_holocene<br />
|30<br />
|Arctic sea ice in warm climates<br />
|}<br />
<br />
== More information ==<br />
* [[SweStore introduction]]<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
* [[Accessing SweStore national storage with the ARC client]]<br />
<!-- * [[Mounting SweStore national storage via WebDAV|Mounting SweStore national storage via WebDAV (Not recomendated at the moment)]] --><br />
If you have any issues using SweStore, please do not hesitate to contact [mailto:swestore-support@snic.vr.se swestore-support].<br />
<br />
== Tools and scripts ==<br />
<br />
A number of externally developed tools and utilities can be useful. Here are some links:<br />
<br />
* [https://github.com/samuell/arc_tools ARC_Tools] - Convenience scripts for the ARC client (only a recursive rmdir so far).<br />
* [http://sourceforge.net/projects/arc-gui-clients ARC Graphical Clients] - Contains the ARC Storage Explorer (SweStore supported development).<br />
* Transfer script, [http://snicdocs.nsc.liu.se/wiki/SweStore/swstrans_arc swetrans_arc], provided by Adam Peplinski / Philipp Schlatter<br />
* [http://www.nordugrid.org/documents/SWIG-wrapped-ARC-Python-API.pdf Documentation of the ARC Python API (PDF)]<br />
<br />
= Centre storage =<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem in the same way on all computational resources at a centre, and a unified structure and nomenclature for all centra. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their data when clusters are decommissioned, not even when the storage hardware itself is replaced.<br />
<br />
== Unified environment ==<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=Swestore-dCache&diff=4218Swestore-dCache2012-06-14T14:38:12Z<p>Tom Langborg (NSC): </p>
<hr />
<div>[[Category:Storage]]<br />
[[Category:SweStore]]<br />
SNIC is building a storage infrastructure to complement the computational resources.<br />
<br />
Many forms of automated measurements can produce large amounts of data. In scientific areas such as high energy physics (the Large Hadron Collider at CERN), climate modeling, bioinformatics, bioimaging etc., the demands for storage are increasing dramatically. To serve these and other user communities, SNIC has appointed a working group to design a storage strategy, taking into account the needs on many levels and creating a unified storage infrastructure, which is now being implemented.<br />
<br />
Swestore is in collaboration with [http://www.ecds.se ECDS], [http://snd.gu.se SND], Bioimage Sweden, [http://www.bils.se BILS], [http://www.uppnex.uu.se UPPNEX], [http://lcg.web.cern.ch/lcg/public/ WLCG], [http://www.nrm.se/ Naturhistoriska Riksmuseet].<br />
<br />
= National storage =<br />
The aim of the nationally accessible storage is to build a robust, flexible and expandable system that can<br />
be used in most cases where access to large scale storage is needed. To the user it should appear as a single large system,<br />
while it is desirable that some parts of the system are distributed across all SNIC centra to benefit from the advantages<br />
of, among other things, locality and cache effects. The system is intended as a versatile long-term storage system.<br />
<br />
== Getting access ==<br />
; Apply for storage<br />
: Please follow the instructions [[Apply for storage on SweStore|here]].<br />
; Get a client certificate.<br />
: Follow the instructions [[Grid_certificates#Requesting_a_certificate|here]] to get your client certificate. For Terena certificates, please make sure you also [[Requesting_a_grid_certificate_using_the_Terena_eScience_Portal#Exporting Terena certificate for use with Grid tools|export the certificate for use with grid tools]]. For Nordugrid certificates, please make sure to also [[Requesting_a_grid_certificate_from_the_Nordugrid_CA#Installing_the_certificate_in_your_browser|install your client certificate in your browser]].<br />
; Request membership in the SweGrid VO.<br />
: Follow the instructions [[Grid_certificates#Requesting_membership_in_the_SweGrid_VO|here]] to get added to the SweGrid virtual organisation.<br />
<br />
== Download and upload data ==<br />
; Browse and download data<br />
: SweStore is accessible from your web browser at http://webdav.swegrid.se/. To browse private data you must first install your certificate in your browser (see above). Your data is available at <code><nowiki>http://webdav.swegrid.se/snic/YOUR_PROJECT_NAME</nowiki></code>.<br />
; Upload and delete data<br />
: Use the ARC client. Please see the instructions for [[Accessing SweStore national storage with the ARC client]].<br />
<br />
== Examples of storage projects ==<br />
Below are some examples of projects that use SweStore today.<br />
<br />
{|border="1" style="text-align:left; border-collapse: collapse; border-width: 1px; border-style: solid; border-color: #000" class="wikitable sortable" valign=top<br />
!Allocation name<br />
!Size in TB<br />
!class="unsortable"|Project full name<br />
|-<br />
|alice<br />
|400<br />
|<br />
|-<br />
|uppnex<br />
|140<br />
|[https://www.uppnex.uu.se UPPmax NExt Generation Sequencing Cluster & Storage]<br />
|-<br />
|brain_protein_atlas<br />
|10<br />
|Mouse brain protein atlas project<br />
|-<br />
| scims2lab<br />
|20<br />
| Identification of novel gene models by matching mass spectrometry data against 6-frame translations of the human genome<br />
|-<br />
|subatom<br />
|<br />
|Low-energy nuclear theory and experiment<br />
|-<br />
|genomics-gu<br />
|10<br />
|Genomics Core Facility, Sahlgrenska Academy at the University of Gothenburg.<br />
|-<br />
|Chemo<br />
|5<br />
|Genetic interaction networks in human disease<br />
|-<br />
|cesm1_holocene<br />
|30<br />
|Arctic sea ice in warm climates<br />
|}<br />
<br />
== More information ==<br />
* [[SweStore introduction]]<br />
* [http://status.swestore.se/munin/monitor/monitor/ Per Project Monitoring of Swestore usage]<br />
* [[Accessing SweStore national storage with the ARC client]]<br />
<!-- * [[Mounting SweStore national storage via WebDAV|Mounting SweStore national storage via WebDAV (Not recomendated at the moment)]] --><br />
If you have any issues using SweStore, please do not hesitate to contact [mailto:swestore-support@snic.vr.se swestore-support].<br />
<br />
= Centre storage =<br />
Centre storage, as defined by the SNIC storage group, is a storage solution that lives independently of the computational resources and can be accessed from all such resources at a centre. Key features include the ability to access the same filesystem in the same way on all computational resources at a centre, and a unified structure and nomenclature for all centra. Unlike cluster storage, which is tightly associated with a single cluster and thus has a limited lifetime, centre storage does not require users to migrate their data when clusters are decommissioned, not even when the storage hardware itself is replaced.<br />
<br />
== Unified environment ==<br />
To make the usage more transparent for SNIC users, a set of environment variables are available on all SNIC resources:<br />
<br />
* <code>SNIC_BACKUP</code> – the user's primary directory at the centre<br>(the part of the centre storage that is backed up)<br />
* <code>SNIC_NOBACKUP</code> – recommended directory for project storage without backup<br>(also on the centre storage)<br />
* <code>SNIC_TMP</code> – recommended directory for best performance during a job<br>(local disk on nodes if applicable)</div>Tom Langborg (NSC)https://snicdocs.nsc.liu.se/w/index.php?title=SweStore_introduction&diff=3780SweStore introduction2012-03-15T12:18:40Z<p>Tom Langborg (NSC): /* Technical overview */</p>
<hr />
<div><br />
PAGE VERY MUCH IN PROGRESS<br />
<br />
<br />
== Technical overview ==<br />
<br />
The SweStore National Storage infrastructure is implemented using the<br />
distributed storage solution [http://www.dcache.org dCache].<br />
<br />
Slides on SweStore National Storage [[media: TekniskbeskrivningSweStore.pdf]]<br />
<br />
The core services of the system are located at HPC2N at Umeå<br />
University. There are over 65 (as of November 2011) online storage pools<br />
attached to the system, located at Lunarc, C3SE, NSC, PDC,<br />
Uppmax and HPC2N. A file upload usually goes directly from the source to<br />
one of the storage pools without passing through the core services, which<br />
gives the system a high aggregate transfer performance. All files<br />
in the SNIC part of the storage are replicated at a different site for<br />
availability reasons. Unless the core services are unreachable, an<br />
entire site can be offline without any loss of functionality for<br />
SweStore.<br />
<br />
There are several access protocols for SweStore National Storage, the<br />
primary ones being SRM and WebDAV.<br />
<br />
SweStore currently uses certificates for authentication. Please see,<br />
[[Grid certificates]], for information on how to get and manage<br />
certificates.<br />
<br />
There are no backups of data on SweStore. Files are replicated to<br />
minimize the risk of data loss due to hardware problems, but if the<br />
end user deletes a file it is lost.<br />
<br />
There is currently no tape backend attached to SweStore, but that may<br />
change in the future. Tape would be used for archiving data that has<br />
not been accessed for a long time.<br />
<br />
A convenient aspect of SweStore access is that staging data in and<br />
out of grid clusters with ARC works well. Instead of copying<br />
data to the clusters and then copying the results out again, you can<br />
let ARC do the staging: specify the input file URLs you need<br />
and ARC handles the transfers for you.</div>Tom Langborg (NSC)