Bioscope

SOLiD Bioscope provides a command-line interface for running application-specific sequence analysis tools. The Bioscope framework enables the user to perform off-instrument secondary and tertiary analyses, and it allows configurable bioinformatics workflows for resequencing (mapping, SNP finding (diBayes), copy number variations, inversions, small indels, large indels) and whole transcriptome analysis (mapping, counting, novel transcript finding, UCSC WIG file creation). Results are delivered in GFF v3 and SAM formats. These industry-standard files from Bioscope can be used with third-party visualization and analysis software tools.

Availability

Resource  Centre  Description
Kalkyl    UPPMAX  Cluster resource of about 21 TFLOPS

Tips and tricks

On UPPMAX/UPPNEX (https://www.uppnex.uu.se) you find Bioscope in the module system on the Kalkyl cluster.

Please type:

module load bioinfo-tools bioscope

to load the module. The command to start Bioscope is then

bioscope.sh -A b2010999

To run on Kalkyl you need to specify which project should be charged for your job (the -A flag above).
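If you are unsure which project IDs your account belongs to, the UPPMAX projinfo tool (assuming it is available on Kalkyl) lists your projects:

projinfo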

Bioscope integrates with the SLURM queueing system on Kalkyl, so you only need to start bioscope.sh on a login node.

We recommend running it in the background by using "nohup":

nohup ./run.sh > MaToBam_nohup.out &

where run.sh contains the run line

bioscope.sh -A b2010999 -l MaToBam.log MaToBam.plan
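As a minimal sketch, a complete run.sh could look like this (using the example project, log, and plan names from above):

#!/bin/bash
# run.sh: start one Bioscope run.
#   -A b2010999     project to charge for the run
#   -l MaToBam.log  log file for this run
#   MaToBam.plan    Bioscope plan file describing the analysis
bioscope.sh -A b2010999 -l MaToBam.log MaToBam.plan

Make the script executable with chmod +x run.sh before starting it with nohup as above.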

The nohup invocation above keeps the job running in the background even if you log out, and you can monitor the progress by reading MaToBam_nohup.out.
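For example, you can follow the log as it is being written with the standard tail command:

tail -f MaToBam_nohup.out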

Bioscope can currently use at most 11 nodes for one Bioscope run, and you can start at most two runs in parallel; this is due to a limit in the SLURM setup. If you need to do several different Bioscope runs, we recommend running them one after another, as sketched below.
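A minimal sketch of such a sequential wrapper script, assuming bioscope.sh does not return until the run has finished (the plan and log file names are hypothetical):

#!/bin/bash
# Start two Bioscope analyses one after another; the second run
# begins only when the first one has completed.
bioscope.sh -A b2010999 -l run1.log run1.plan
bioscope.sh -A b2010999 -l run2.log run2.plan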

The user manual is attached to this post; you need to be logged in with a registered account to read the attachment.

You can also find some test data on the system in the folder:

/bubo/nobackup/uppnex/bioscopedata
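To see what is available there, simply list the folder:

ls /bubo/nobackup/uppnex/bioscopedata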

License

License: Site license.

Experts

No experts have currently registered expertise on this specific subject. List of registered field experts:

Name | Centre | Field | AE FTE | General activities
Anders Hast (UPPMAX) | UPPMAX | Visualisation, Digital Humanities | 30 | Software and usability for projects in digital humanities
Anders Sjölander (UPPMAX) | UPPMAX | Bioinformatics | 100 | Bioinformatics support and training, job efficiency monitoring, project management
Anders Sjöström (LUNARC) | LUNARC | GPU computing, MATLAB, General programming, Technical acoustics | 50 | Helps users with MATLAB, general programming, image processing, usage of clusters
Birgitte Brydsö (HPC2N) | HPC2N | Parallel programming, HPC | | Training, general support
Björn Claremar (UPPMAX) | UPPMAX | Meteorology, Geoscience | 100 | Support for geosciences, Matlab
Björn Viklund (UPPMAX) | UPPMAX | Bioinformatics, Containers | 100 | Bioinformatics, containers, software installs at UPPMAX
Chandan Basu (NSC) | NSC | Computational science | 100 | EU projects IS-ENES and PRACE; working on climate and weather codes
Diana Iusan (UPPMAX) | UPPMAX | Computational materials science, Performance tuning | 50 | Compilation, performance optimization, and best practice usage of electronic structure codes
Frank Bramkamp (NSC) | NSC | Computational fluid dynamics | 100 | Installation and support of computational fluid dynamics software
Hamish Struthers (NSC) | NSC | Climate research | 80 | User support focused on weather and climate codes
Henric Zazzi (PDC) | PDC | Bioinformatics | 100 | Bioinformatics application support
Jens Larsson (NSC) | NSC | Swestore | |
Jerry Eriksson (HPC2N) | HPC2N | Parallel programming, HPC | | HPC, parallel programming
Joachim Hein (LUNARC) | LUNARC | Parallel programming, Performance optimisation | 85 | HPC training, parallel programming support, performance optimisation
Johan Hellsvik | PDC | Materials science | 30 | Materials theory, modeling of organic magnetic materials
Johan Raber (NSC) | NSC | Computational chemistry | 50 |
Jonas Lindemann (LUNARC) | LUNARC | Grid computing, Desktop environments | 20 | Coordinating SNIC Emerging Technologies; developer of ARC Job Submission Tool; grid user documentation; leading the development of ARC Storage UI, Lunarc Box, Lunarc HPC Desktop
Krishnaveni Chitrapu (NSC) | NSC | Software development | |
Lars Eklund (UPPMAX) | UPPMAX | Chemistry, Data management, FAIR, Sensitive data | 100 | Chemistry codes, databases at UPPMAX, sensitive data, PUBA agreements
Lars Viklund (HPC2N) | HPC2N | General programming, HPC | | HPC, general programming, installation of software, support, containers
Lilit Axner (PDC) | PDC | Computational fluid dynamics | 50 |
Marcus Lundberg (UPPMAX) | UPPMAX | Computational science, Parallel programming, Performance tuning, Sensitive data | 100 | I help users with productivity, program performance, and parallelisation. I also work with allocations and with sensitive data questions
Martin Dahlö (UPPMAX) | UPPMAX | Bioinformatics | 10 | Bioinformatics support
Matias Piqueras (UPPMAX) | UPPMAX | Humanities, Social sciences | 70 | Support for humanities and social sciences, machine learning
Mikael Djurfeldt (PDC) | PDC | Neuroinformatics | 100 |
Mirko Myllykoski (HPC2N) | HPC2N | Parallel programming, GPU computing | | Parallel programming, HPC, GPU programming, advanced support
Pavlin Mitev (UPPMAX) | UPPMAX | Computational materials science | 100 |
Pedro Ojeda-May (HPC2N) | HPC2N | Molecular dynamics, Machine learning, Quantum chemistry | | Training, HPC, quantum chemistry, molecular dynamics, R, advanced support
Peter Kjellström (NSC) | NSC | Computational science | 100 | All types of HPC support
Peter Münger (NSC) | NSC | Computational science | 60 | Installation and support of MATLAB, Comsol, and Julia
Rickard Armiento (NSC) | NSC | Computational materials science | 40 | Maintainer of the scientific software environment at NSC
Szilard Pall | PDC | Molecular dynamics | 55 | Algorithms and methods for accelerating molecular dynamics; parallelization and acceleration of molecular dynamics on modern high-performance computing architectures; HPC, manycore and heterogeneous architectures; GPU computing
Thomas Svedberg (C3SE) | C3SE | Solid mechanics | |
Torben Rasmussen (NSC) | NSC | Computational chemistry | 100 | Installation and support of computational chemistry software
Wei Zhang (NSC) | NSC | Computational science, Parallel programming, Performance optimisation | | Code optimization, parallelization
Weine Olovsson (NSC) | NSC | Computational materials science | 90 | Application support, installation and help
Åke Sandgren (HPC2N) | HPC2N | Computational science | 50 | SGUSI

Links