OSG Newsletter, October 2011

Running OpenSees? Production for NEES on OSG

NEES (George E. Brown, Jr. Network for Earthquake Engineering Simulation) researchers at the University of California, San Diego have been running OpenSees on the Open Science Grid. The goal has been to conduct parametric studies of the response of large-scale nonlinear models of structural systems (buildings and bridges) and soil-structure systems to earthquakes.

Two types of applications, at two scales of computational complexity, are under study. The first is a probabilistic seismic demand hazard analysis of the 13-story National Earthquake Hazards Reduction Program (NEHRP) building design example. This consists of 180 nonlinear time-history analyses of an advanced nonlinear finite element (FE) model of the building, with a total estimated clock time of 2,160 hours and 250 GB of compressed data generated.

The second type of application studies the sensitivity of the probabilistic seismic demand hazard to the FE model parameters of the same NEHRP building model. The estimated clock time required on a single desktop computer would be about 12 years, and the estimated compressed data produced is 12 TB. OSG's experience in the management, transfer, and analysis of large research data sets has proven to be of great importance. The support provided by OSG and the RENCI team included selecting data compression algorithms, tuning the Globus Online service for data transfers between the submit node and the users' local machine, and resolving other software and hardware issues related to the processing of large datasets. A smaller proof-of-concept project that OSG had done earlier this year was helpful in understanding some of these areas.
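
To put those numbers in perspective, a back-of-the-envelope estimate shows how opportunistic slots turn years of desktop time into days of wall-clock time. The Python sketch below is purely illustrative: the 12-hour per-analysis runtime follows from the first study (2,160 hours over 180 analyses), while the concurrent slot counts are assumptions rather than measured OSG figures.

# Back-of-the-envelope turnaround estimate for an embarrassingly parallel
# parameter sweep. Per-job runtime and slot counts are illustrative
# assumptions, not measured OSG numbers.
import math

HOURS_PER_YEAR = 24 * 365

def turnaround_days(total_cpu_hours, concurrent_slots, hours_per_job=12.0):
    """Wall-clock days if jobs run in waves of `concurrent_slots` at a time."""
    n_jobs = total_cpu_hours / hours_per_job
    waves = math.ceil(n_jobs / concurrent_slots)
    return waves * hours_per_job / 24.0

# First study: 180 analyses, ~2,160 CPU-hours total (about 12 hours each).
print(turnaround_days(2160, concurrent_slots=180))                 # ~0.5 days

# Second study: ~12 years of single-desktop time (~105,000 CPU-hours).
print(turnaround_days(12 * HOURS_PER_YEAR, concurrent_slots=500))  # ~9 days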

As comprehensive validation of numerical models for structural systems becomes increasingly important, the opportunistic use of OSG's computational resources offers great leverage in terms of computing power for NEES researchers. One ongoing effort, carried out with the help of the OSG User Support team, is to enable and improve the running of jobs that are up to 97 hours long, which would allow analyses of still more complicated models. Also, as of this summer it is possible to submit jobs to OSG through the NEEShub portal rather than through direct job submission at an OSG host. A first OpenSees front-end job submission tool has been developed using the NEEShub framework, and efforts to make it more capable and accessible to different researchers are also under way.
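
For readers curious what such a parametric sweep can look like on a Condor-based submit host, below is a minimal sketch that writes one submit description per parameter value. The wrapper script run_opensees.sh, the input file model.tcl, and the parameter values are hypothetical placeholders; actual submissions (for example, through the NEEShub tool described above) will differ in the details.

# Minimal sketch: write one HTCondor submit description per point in a
# parametric OpenSees sweep. run_opensees.sh and model.tcl are hypothetical
# placeholders for a wrapper script and an OpenSees input model.
SUBMIT_TEMPLATE = """\
universe                = vanilla
executable              = run_opensees.sh
arguments               = model.tcl {param:.3f}
transfer_input_files    = model.tcl
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
output                  = sweep_{index:03d}.out
error                   = sweep_{index:03d}.err
log                     = sweep.log
queue
"""

def write_submit_files(param_values):
    """Write sweep_NNN.sub files, one per parameter value in the sweep."""
    for i, p in enumerate(param_values):
        with open("sweep_{:03d}.sub".format(i), "w") as f:
            f.write(SUBMIT_TEMPLATE.format(index=i, param=p))

# Example: 180 runs spanning a single model parameter.
write_submit_files([0.10 + 0.005 * i for i in range(180)])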

~ André R. Barbosa, Joel P. Conte, José I. Restrepo, UCSD


CMS Production

In October, proton-proton data taking wrapped up for the year at the LHC. It was a very successful run, setting records for instantaneous luminosity, with more than 5.3 inverse femtobarns delivered to CMS. We are fully utilizing our Tier 1 resources across the globe, as well as executing more than 50,000 analysis jobs in parallel on our Tier 2s, many of them using software provided and supported by the Open Science Grid. We are now getting ready for an intense month-long heavy ion run and a well-deserved break before proton-proton data taking resumes in March.

~ Burt Holzman


From the Executive Director

Several members of the OSG will be at SC11 in Seattle, and we look forward to seeing you there. The Fermilab, Indiana, Purdue, and University of Florida booths will all feature Open Science Grid work at some level. If there are others, please let me know, and we will add them to the information we will post.

I am pleased to report reuse of our 2011 OSG Summer School material by the University of Nebraska and in a related course in the University of Wisconsin-Madison Computer Sciences department. The original School material is posted here.

In addition, GridColombia started training instructors from institutions across Colombia using some of the content from the 2011 Summer School. We hope that our online repository of instructional materials can evolve into an increasingly valuable resource.

I would like to welcome Kevin Hill as a new member of the Security Team. Kevin will be working in Security Operations (some of you may have seen his recent email about one of the vulnerabilities we are facing). We also welcome Von Welch, who is consulting on the Identity Management work within OSG. We will have more to communicate on this in the next month or two.

~ Ruth Pordes


Featured Site: T3_US_UMD, the University of Maryland Tier 3 HEP CMS Computing Cluster

Site name: T3_US_UMD, the University of Maryland Tier 3 HEP CMS Computing Cluster
Location: College Park, Maryland
Owner: High Energy Physics CMS group, University of Maryland
Cores: 160 cores including management nodes, with 232 job slots (taking advantage of hyperthreading on the R510s)
Storage: 9 TB of RAIDed NFS disk for local use; the Storage Element provides 66 TB of distributed Hadoop disk
Used by: University of Maryland High Energy Physics; University of Maryland Nuclear Chemistry CMS group
Opportunistic access: Opportunistic use of our site by members of the CMS VO is encouraged.

What is unusual about your site and the challenges you face?
We have extensively documented the setup and configuration of our site and are maintaining the documentation as a resource for other system administrators in the CMS Tier3 community.

What projects are you working on now?
We've recently changed our Storage Element to use the Hadoop Distributed File System, increasing dataset storage from 9 TB to 66 TB. It is in use now, and we look forward to seeing the system tested and confirming that it is stable.

If you could change one thing about OSG what would it be?
Having significant upgrades well documented for non-expert system administrators, so that the upgrade process goes faster.

~ Dr. Marguerite Tonjes


EGI Technical Forum

Henry Kissinger famously asked, “Who do I call if I want to call Europe?” I was reminded of this quote when I attended the EGI Technical Forum in Lyon, France and then visited CERN in September. While I don’t need to call Europe as a whole, as Operations Coordinator for OSG I may need to contact any one of the 30+ National Grid Infrastructures (NGIs) that make up the European Grid Infrastructure (EGI) in the event of an operational crisis.

Together with Scott Teige, the OSG Operations Technical Lead, I presented material on both technical and personal communications between OSG, WLCG, and EGI, including Global Grid User Support System (GGUS) ticket synchronization, availability and reliability reporting, Berkeley Database Information Index (BDII) information exchange, and various other ways to keep communication channels open between OSG and our European counterparts. We came away with several technical action items from the SAM team. In addition, OSG Operations took a seat as a non-voting member of the EGI Operations Management Board. We are also participating in the WLCG Operations Technical Evaluation Group, which will provide input to chart future activities within the WLCG. We look forward to continued collaboration with WLCG and EGI at an operational level.

~ Rob Quick


OSG Document Database: Updates in the Last 30 Days

OSG-doc-#   Title                                        Author(s)        Topic(s)     Last Updated
1078-v1     US LHC Tier2 Activity for September 2011     Adam Caprez      Metrics      18 Oct 2011
507-v1      OSG NERSC Allocation Request                 Doug Olson       Agreements   05 Oct 2011
1076-v1     Gratia: New Challenges in Grid Accounting    Philippe Canal   Accounting   03 Oct 2011


Save the Date:

The next OSG All Hands Meeting will be March 19-22, 2012 in Lincoln, Nebraska!
