OSG Newsletter, July 2010

OSG Annual Report to the National Science Foundation

The OSG Annual Report to NSF is now available, and we recommend that you read it along with its attachments, both of which show the tremendous accomplishments and progress we have made this year. The overall usage of OSG continues to grow, while utilization by each stakeholder varies depending on its needs during any particular interval. Overall use of the facility for the 12-month period ending June 1, 2010 was 272M hours, compared to 182M hours for the previous 12 months—a 50% increase.

Non-HEP usage has increased substantially over the past year, primarily due to the increased ability of LIGO to submit Einstein@Home jobs supporting the pulsar analysis. From June 2009 to June 2010, the percentage of non-HEP usage increased from 4.5% to 20%, more than a four-fold increase in the fractional use. LIGO accounts for approximately 95K hours/day (11% of the total), and non-physics use now averages 80K hours/day (9% of the total), reflecting efforts over the past year to support SBGrid and incorporate new resources, such as Nebraska's Holland Computing Center.

The full report (OSG-doc-965) is available from the OSG document database.

~ Paul Avery, Chander Sehgal


Novel approaches to finding new proteins with Open Science Grid

The Sliz Lab at Harvard Medical School focuses on computational problems in protein structure analysis. A key step in structure analysis is the determination of high-resolution (atomic-scale) representations of the 3D structure of biologically significant proteins. Even using conventional approaches, this is a computationally intensive task. Over the past year, the group at the Sliz Lab has used the OSG MatchMaker and, more recently, the glideinWMS system to harness opportunistic compute cycles from two dozen OSG sites, regularly sustaining over 2000 concurrent jobs and completing in excess of 40,000 CPU hours per day.

This compute power has allowed the group to establish an important data processing workflow used to translate protein crystal scattering data from X-ray crystallography into a resolved, atomic scale model of the protein.

A single iteration of this workflow can take over 20,000 CPU hours, making it impractical for typical lab or department compute clusters and certainly beyond what can be achieved with desktop computing. To date, several structures have been determined using this technique in cases where the standard approaches had failed. The Sliz Lab is in the process of providing a publicly accessible web portal to allow researchers to submit their protein X-ray data. Open Science Grid has been the enabling technology of this approach, and the group looks forward to developing other novel grid-based techniques for protein structure computations.

~ Ian Stokes-Rees, Chander Sehgal


OGF Standards

Just prior to the June Open Grid Forum meeting, I learned that I had been selected as the new OGF Vice President for Standards. I am surprised and honored by this selection. I hope to encourage and promote work that helps set working standards and improve their adoption throughout the grid and cloud computing community, and to continue OGF's emphasis on open standards. I hope we will all try new things we have not yet had the chance to. I will listen carefully to the community as we go forward; I have already learned much from you as colleagues in grid development.

As an early participant in the International Grid Trust Federation, I am very interested in seeing continued progress in streamlining, hardening and strengthening the role of strong authentication in areas of grid and cloud computing that require strong control of access credentials, while leaving room for lightweight standards that are appropriate for high-transaction-rate (e.g., cloud) computing.

OSG's emphasis on commonality, best practices, and harmonization makes it a natural focus for contribution to the standards process. I look forward to input from the grid and cloud computing communities, both on the development and user sides of the equation.

~ Alan Sill


From the Executive Director

It is thrilling to see the many physics results being presented by the ATLAS, CDF, CMS, and DZero experiments at the International Conference on High Energy Physics in Paris this week, and to know that we contributed! The end science is what we are here to help, be it physics this week or biology the next. Thank you to the great team for making our part possible and productive!

Last week we held a successful OSG summer school with 17 students from a diversity of disciplines. In this newsletter I will simply acknowledge the superlative job done by the organizers and teachers; we will give more details next month, when people have had time to write something up more fully.

We held the annual staff retreat and a blueprint discussion on campus infrastructure at the same time as the summer school; it was a busy week. In August there is the site administrators meeting at Vanderbilt, and in September the storage workshop. While emails and phone calls are good, it is the few face-to-face meetings we have that really move our work and our collaboration forward. We hope many of you can make these meetings.

~ Ruth Pordes, Executive Director



OSG Summer School 2010


The Open Science Grid Consortium recently hosted a summer school on high-throughput computing (HTC). Seventeen students attended and learned how to use HTC, both locally on their campuses and nationally with OSG, to run large-scale computing applications. The four-day event emphasized hands-on exercises in addition to the lectures, to give the students a deep understanding of the techniques they learned. The lecturers were developers and heavy users of HTC, so they were able to share their real-life experience with the students.

The school started by introducing students to local high-throughput computing using Condor, then expanded into the use of grid technologies for computing and storage. It included use of the latest glide-in (a.k.a. pilot job) technologies to efficiently scale computations up across many grid sites.
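To give a flavor of the local HTC workflow taught at the school, here is a minimal Condor submit description of the kind students would write to queue many independent jobs; the executable and file names are hypothetical, not from the actual course material.

```
# Hypothetical submit description: run 100 independent instances
# of analyze.sh, one per input file, in Condor's vanilla universe.
universe   = vanilla
executable = analyze.sh
arguments  = input_$(Process).dat
output     = job_$(Process).out
error      = job_$(Process).err
log        = jobs.log
queue 100
```

Submitting this file with `condor_submit` places 100 jobs in the local queue; Condor then matches each job to an available machine, which is the same matchmaking model that glide-in technologies extend across grid sites.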

At the end of the school, there were four lectures by local science users who shared their HTC experiences. One of them, Edgar Spalding, talked about how HTC changed his science: it wasn't just a matter of being able to do more computation, but it was a fundamental shift in the kinds of science he could do.

The students came from a wide variety of backgrounds: physics, biology, GIS, computer science, and more. Many of them have immediate needs for HTC, while others are thinking about how to improve their campus computational infrastructure.

OSG hopes to host a similar school next summer.

More information is available on the OSG Summer School 2010 page.

~ Alain Roy



CMS Production Report

The CMS experiment continues to take data as the LHC steadily increases luminosity. In preparation for the International Conference on High Energy Physics (ICHEP), we were able to run two reprocessing passes on the newest data, adding about 100 inverse nanobarns and roughly doubling the amount of analyzed data, all thanks to CMS resources made available by OSG.

~ Burt Holzman


*Special Announcement*

Mark your calendars! The next OSG All Hands Meeting will take place March 7, 2011, in Boston, MA, hosted by the SBGrid VO and Harvard Medical School.

We look forward to seeing you all there!

~ Ruth Pordes, Executive Director


WLCG Data and Storage Management Jamboree

About one hundred people gathered in Amsterdam, the Netherlands, on June 16-19 to discuss the evolution of WLCG data and storage management. The jamboree, hosted by Nikhef, took place in the heart of the city on one of the oldest canals, Keizersgracht. Representatives from various institutions and all major HEP experiments attended. The purpose of the meeting was to discuss how significant advances in technology could help modernize a storage architecture built on the MONARC model developed over ten years ago. The new approaches should provide better storage scalability and performance, a global namespace for data access, and more reliable dataset transfer and synchronization with data catalogs.

Many other topics were discussed as well, including network requirements, improving access to data for analysis jobs, peer-to-peer data caching, and asynchronous staging of job output. Several teams proposed approaches to the current problems; some were selected to demonstrate that their approach works, and they will report their progress periodically to the Grid Deployment Board. The goal is to have working prototypes by the end of the year and to be ready for production by 2013.

The excitement and lively discussions generated during the meeting were almost comparable to the enthusiasm of the Netherlands soccer fans who had gathered in the nearby bars, all dressed in orange!

~ Tanya Levshina




Documents Modified in the Last 20 Days

OSG-doc-# | Title | Author(s) | Topic(s) | Last Updated
977-v1 | Assessment of Core Services provided to U.S. ATLAS and U.S. CMS by OSG | Lothar A.T. Bauerdick et al. | Reports | 19 Jul 2010
976-v1 | A Science Driven Production Cyberinfrastructure - the Open Science Grid | Executive Board | Communications | 14 Jul 2010
975-v1 | iSGTW Status Update Feb 2010 | Miriam Boon | Resources | 12 Jul 2010
974-v1 | iSGTW Status Update Dec 2009 | Miriam Boon | Resources | 12 Jul 2010
973-v1 | iSGTW Status Update Oct 2009 | Miriam Boon | Resources | 12 Jul 2010
972-v1 | iSGTW Status Update March 2010 | Miriam Boon | Resources | 12 Jul 2010
971-v1 | GridTalk iSGTW Readership Survey January 2010 | Miriam Boon | Resources | 12 Jul 2010
970-v1 | GridTalk iSGTW Readership Survey July 2009 | Miriam Boon | Resources | 12 Jul 2010
969-v1 | iSGTW Status Update May 2010 | Miriam Boon | Resources | 12 Jul 2010
968-v1 | iSGTW Status Update Apr 2010 | Miriam Boon | Resources | 12 Jul 2010
967-v1 | iRODS for OSG, Preliminary Report | Ted Hesselroth | Storage, Year Four Workplans | 30 Jun 2010
965-v2 | OSG Annual Report to NSF (June 2010) | Paul Avery et al. | Governance | 30 Jun 2010
909-v5 | year4 Key Stakeholder Requests | OSG Executive Board et al. | Year Four Workplans | 29 Jun 2010



