location: Diff for "compute_resources"

Institute of Mathematics - PublicMathWiki:

Differences between revisions 1 and 15 (spanning 14 versions)
Revision 1 as of 2015-04-24 09:12:19
Size: 3281
Editor: crose
Comment:
Revision 15 as of 2015-04-24 14:00:25
Size: 5947
Editor: crose
Comment:
Deletions are marked like this. Additions are marked like this.
Line 2: Line 2:
 * In general, the offers are limited to members of I-MATH and their students.
 * Other UZH members are welcome to contact support@math.uz.ch.
Line 5: Line 7:
 * For medium computations, please contact support@math.uzh.ch to reserve some compute nodes and to ask for the best fitting possibilty.
 * For medium computations, please contact support@math.uzh.ch to ask for the best fitting possibilty and to reserve nodes.
Line 9: Line 11:
 *
|| host || CPU Type || number cores || RAM || local disk || date || purpose ||
|| asprey || 2 x Intel Xeon 4C E5-2643 3.30 GHz || 16 (HT enabled) || 128 GB || 73 GB || 12/2012 || student: matlab, mathenatica, maple, R ||
|| baxter || 8 x Intel Xeon 8C X7550 2.00 GHz || 64 (HT disabled) || 512 GB || 300 GB || 10/2010 || research, magma ||
|| crous || 8 x Intel Xeon 10C E7-2850 2.0 GHz || 80 (HT disabled) || 2 TB || 300 GB || 05/2013 || research ||

||<rowbgcolor="#dddddd"> host || CPU type || cores || RAM || local disk || date || purpose || CPU specs ||
|| asprey || 2 x Intel Xeon 4C E5-2643 3.30 GHz || 16 (HT enabled) || 128 GB || 73 GB || 12/2012 || student: matlab, mathematica, maple, R || [[http://ark.intel.com/products/64587/Intel-Xeon-Processor-E5-2643-10M-Cache-3_30-GHz-8_00-GTs-Intel-QPI|E5-2643]] ||
|| baxter || 8 x Intel Xeon 8C X7550 2.00 GHz || 64 (HT disabled) || 512 GB || 300 GB || 10/2010 || research, magma || [[http://ark.intel.com/products/46498/Intel-Xeon-Processor-X7550-18M-Cache-2_00-GHz-6_40-GTs-Intel-QPI?wapkw=x7550|x7550]] ||
|| crous || 8 x Intel Xeon 10C E7-2850 2.0 GHz || 80 (HT disabled) || 2 TB || 300 GB || 05/2013 || research || [[http://ark.intel.com/products/53573/Intel-Xeon-Processor-E7-2850-24M-Cache-2_00-GHz-6_40-GTs-Intel-QPI?wapkw=e7-2850|E7-2850]] ||
Line 15: Line 17:
|| estonia0 || 2 x Intel Xeon 6C E5-2640 2.50 GHz || 24 (HT enabled) || 256 GB || 2TB || 05/2013 || research ||
|| estonia1 || " || "          || "   || " || " || " ||
|| estonia0 || 2 x Intel Xeon 6C E5-2640 2.50 GHz || 24 (HT enabled) || 256 GB || 2TB || 05/2013 || research || [[http://www.intel.com/buy/us/en/product/components/intel-xeon-processor-e5-2640-15m-cache-250-ghz-720-gts-intel-qpi-250678?wapkw=e5-2640#tech_specs|E5-2640]] ||
|| estonia1 || " || 12 (HT disabled) || " || " || "    || "                 ||
Line 27: Line 29:
|| jordan0 || " || " || " || " || " || professor & assistant: matlab, mathenatica, maple, R ||
|| jordan0 || " || " || " || " || " || professor & assistant: matlab, mathematica, maple, R ||
Line 33: Line 35:
= Computations at I-MATH: concept =
 * There is no distributed compute cluster environment at I-MATH.
 * Various MPI packages and libs are installed.
= Concept =

== Access ==
 * The compute nodes can be accessed by SSH only from '''inside''' I-MATH.
 * Use [[thinlinc]]
  * matlab, mathematica, maple, R, rstudio: just start the application via menu 'Applications > Science > ...' - you'll automatically redirected to the appropriate compute server.
  * For non standard compute software please first log on the compute node via 'ssh', than start the program.
 * SSH: Log on 'ssh.math.uzh.ch' and jump to the compute node again via 'ssh'.

== Operating System ==
 * Ubuntu LTS (Linux) - all compute servers
 * Very very limited software might be installed on some Windows virtual machines.

== Storage ==
 * If you need more than 1GB disk space, please contact support@math.uzh.ch.
 * Remote (NFS)
  * <1GB disk space: Personal Home directory
  * >1GB disk space: `/compute/<account>`
 * Local
  * `/export/user/<account>`

== Programs / Software ==
 * All compute servers use the same applications and versions as installed on the thinlinc terminals.
 * Please report missing or outdated software or wishes to support@math.uzh.ch.
 * There is no Intel C or Fortran Compiler.
 * If you need other program versions than the default one: open a terminal, type the program name followed by two 'TAB' presses: this will show all available versions of the specified program.

== Parallel computing ==
 * Cluster software
  * Not installed / offered at I-MATH.
  * Various MPI packages and libs are installed. Compiling and preparation to run programs later on a distributed compute cluster are possible.
 * Some programs, like Matlab, offers limited builtin auto parallelization.
 * All of the compute nodes are shared memory machines. It's reasonable to start as many programs in parallel as long the `load` is less or qual the number of cores.
  * Determine the load by using the `uptime` command inside a terminal. The last three numbers are the average number of jobs (=load) in the run queue over the last 1, 5 and 15 minutes. {{{
$ uptime
 15:55:20 up 22 days, 24 min, 1 user, load average: 11.97, 11.73, 11.62
}}}

Compute Resources

  • In general, the offers are limited to members of I-MATH and their students.
  • Other UZH members are welcome to contact support@math.uzh.ch.

Offers

  • For small / ad hoc computations, just start the compute software via the thinlinc environment.
  • For medium computations, please contact support@math.uzh.ch to ask for the best fitting possibility and to reserve nodes.

  • For large computations and professional support please check http://www.s3it.uzh.ch/.

Resources at I-MATH

|| host || CPU type || cores || RAM || local disk || date || purpose || CPU specs ||
|| asprey || 2 x Intel Xeon 4C E5-2643 3.30 GHz || 16 (HT enabled) || 128 GB || 73 GB || 12/2012 || student: matlab, mathematica, maple, R || E5-2643 ||
|| baxter || 8 x Intel Xeon 8C X7550 2.00 GHz || 64 (HT disabled) || 512 GB || 300 GB || 10/2010 || research, magma || x7550 ||
|| crous || 8 x Intel Xeon 10C E7-2850 2.0 GHz || 80 (HT disabled) || 2 TB || 300 GB || 05/2013 || research || E7-2850 ||
|| david || " || " || " || " || " || " || ||
|| estonia0 || 2 x Intel Xeon 6C E5-2640 2.50 GHz || 24 (HT enabled) || 256 GB || 2 TB || 05/2013 || research || E5-2640 ||
|| estonia1 || " || 12 (HT disabled) || " || " || " || " || ||
|| estonia2 || " || " || " || " || " || " || ||
|| estonia3 || " || " || " || " || " || " || ||
|| georgia0 || " || " || " || " || " || " || ||
|| georgia1 || " || " || " || " || " || " || ||
|| georgia2 || " || " || " || " || " || " || ||
|| georgia3 || " || " || " || " || " || " || ||
|| iran0 || " || " || " || " || " || " || ||
|| iran1 || " || " || " || " || " || " || ||
|| iran2 || " || " || " || " || " || " || ||
|| iran3 || " || " || " || " || " || " || ||
|| jordan0 || " || " || " || " || " || professor & assistant: matlab, mathematica, maple, R || ||
|| jordan1 || " || " || " || " || " || " || ||
|| jordan2 || " || " || " || " || " || courses || ||
|| jordan3 || " || " || " || " || " || courses || ||

(" = same value as in the row above)

Concept

Access

  • The compute nodes can be accessed by SSH only from inside I-MATH.

  • Use thinlinc

    • matlab, mathematica, maple, R, rstudio: just start the application via the menu 'Applications > Science > ...' - you'll be automatically redirected to the appropriate compute server.

    • For non-standard compute software, please first log on to the compute node via 'ssh', then start the program.
  • SSH: Log on to 'ssh.math.uzh.ch' and jump from there to the compute node via 'ssh' (see the sketch below).
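A minimal sketch of the SSH path above ('<account>' is a placeholder for your login; 'baxter' stands for any compute node from the table):

      $ ssh <account>@ssh.math.uzh.ch   # log on to the SSH gateway
      $ ssh baxter                      # jump from there to the compute node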

Operating System

  • Ubuntu LTS (Linux) - all compute servers
  • Only very limited software might be installed on some Windows virtual machines.

Storage

  • If you need more than 1 GB of disk space, please contact support@math.uzh.ch.

  • Remote (NFS)
    • <1GB disk space: Personal Home directory

    • >1GB disk space: /compute/<account>

  • Local
    • /export/user/<account>
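For orientation, a quick way to check how much space is free in these locations ('<account>' is your login; the '/compute' path only exists once it has been set up for you):

      $ df -h ~ /compute/<account> /export/user/<account>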

Programs / Software

  • All compute servers use the same applications and versions as installed on the thinlinc terminals.
  • Please report missing or outdated software, or any wishes, to support@math.uzh.ch.

  • There is no Intel C or Fortran Compiler.
  • If you need a program version other than the default: open a terminal and type the program name followed by two 'TAB' presses; this will show all available versions of the specified program (see the example below).
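For example, assuming matlab is the program in question (the same works for any installed program; no output is shown here because the installed versions vary):

      $ matlab<TAB><TAB>    # press TAB twice after the name to list all installed versions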

Parallel computing

  • Cluster software
    • Not installed / offered at I-MATH.
    • Various MPI packages and libraries are installed. Compiling and preparing programs that will later run on a distributed compute cluster is possible.
  • Some programs, like Matlab, offer limited built-in auto-parallelization.
  • All of the compute nodes are shared-memory machines. It's reasonable to start programs in parallel as long as the load stays less than or equal to the number of cores.

    • Determine the load by using the uptime command inside a terminal. The last three numbers are the average number of jobs (=load) in the run queue over the last 1, 5 and 15 minutes.

      $ uptime
       15:55:20 up 22 days, 24 min,  1 user,  load average: 11.97, 11.73, 11.62
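
A minimal sketch for checking whether there is headroom for another job: compare the number of cores on the node with the current load (both commands are standard on the Ubuntu compute servers):

      $ nproc      # number of cores available on this node
      $ uptime     # keep the load average at or below that number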

PublicMathWiki: compute_resources (last edited 2024-04-11 09:31:49 by crose)