-- DanFraser - 10 Aug 2010

The full report with links is available at https://twiki.grid.iu.edu/bin/view/Production/WeeklyProductionMeetings

Action/Significant Items:

  • Although the GOC has requested it on multiple occasions, there has been no response from CERN on why the CERN BDIIs were failing. The GOC will keep trying to get this info. (Scott, Tony)
  • At 4:35 EST on August 23, it was reported that no RSV_SAM messages had been received since August 20. By 10:30 EST the scripts had been reverted to their original versions and the missing data re-sent. Scott has a link to a draft report below.
  • The phone number needs to be changed for the Production calls, as ESnet has transitioned to ReadyTalk. (Dan to contact Sarah or Alain)
  • The LHC has now scaled up to collisions of 48 bunches. The plan is to be at ~350 sometime in November.

Attendees:

  • Xin, Armen, Britta, Robert E., Brian, Suchandra, Tony, Marco, Scott, Mine, Derek, Dan

CMS (Burt)

  • Job statistics for last week
    • ~15k hours/day
    • 17465 jobs/day
    • 89% success

  • Transfer statistics for last week
    • ~278 TB/day

Atlas (Armen & Xin)

  • General production status
    • The LHC is continuing stable data taking and continuing to increase the luminosity. At the end of the week the number of bunches was doubled to 48 per beam, and yesterday set a new single-day record for integrated luminosity. ATLAS production was low at the very beginning of the week and then ramped up to the usual average of 9k running jobs, mainly simulation. Waiting for the next large-scale re-simulation of all existing MC samples, as well as a full reprocessing of data samples, in September.
  • Job statistics for last week.
    • Gratia report: USATLAS ran 1.6M jobs, with CPU/Walltime ratio of 78%.
    • Panda world-wide production report (real jobs): no report this week (shift captain is on vacation)
  • Data Transfer statistics for last week
    • The BNL T1 data transfer rate last week was 200-400 TB/day.
  • Issues
    • Opening more job slots to OSG users: talked to T1 management; a decision is coming soon.

LIGO (Britta, Robert E.)

Gratia Reports

  • Last week's total usage: 4 users utilized 36 sites
    • 82478 jobs total (39804 succeeded / 42674 failed = 48.3% success)
    • 1049297.2 wall clock hours total (739699.9 succeeded / 309597.4 failed = 70.5% success)
  • Current week's total usage: 4 users utilized 39 sites
    • 69587 jobs total (46648 succeeded / 22939 failed = 67.0% success)
    • 964151.5 wall clock hours total (854478.8 succeeded / 109672.7 failed = 88.6% success)
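
The success percentages above follow directly from the succeeded/failed counts. A minimal sketch of the calculation in Python, using last week's figures from the Gratia report:

  # Success rate = succeeded / (succeeded + failed) * 100.
  def success_rate(succeeded, failed):
      """Return the success percentage for the given totals."""
      return 100.0 * succeeded / (succeeded + failed)

  print("jobs:  %.1f%%" % success_rate(39804, 42674))        # -> 48.3%
  print("hours: %.1f%%" % success_rate(739699.9, 309597.4))  # -> 70.5%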

LIGO / E@OSG

  • Recent Average Credit (RAC): 2,249,756.08063; last week: 2,079,012.32340
  • E@H rank based on RAC: 1 (±0)
  • E@H rank based on accumulated credits: 4 (±0)

LIGO / INSPIRAL

  • OSG-LIGO storage task force

Documentation

Statistics and plots

    • Nebraska --> CIT_CMS_T2 transfers
      • 12 DAGs, transferring 5000 files each, using 10 different gsiftp servers at both ends
      • Submitted Wednesday, still running: the data transfers finish quickly, but the md5sum checks sit in the remote queue
      • Testing a fix for the remote_initialdir problem at CIT_CMS_T2
    • Will test a glidein transfer DAG to see whether the md5sum checks complete faster (see the sketch after this list)
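
A minimal Python sketch of the per-file transfer-and-verify pattern described above. The endpoints and checksum here are hypothetical placeholders (not the actual Nebraska or CIT_CMS_T2 configuration), and it assumes the Globus Toolkit's globus-url-copy client is on the PATH:

  import hashlib
  import subprocess
  import sys

  SRC = "gsiftp://gridftp.example.edu/store/file0001.dat"  # hypothetical source
  DST = "/tmp/file0001.dat"                                # hypothetical destination
  EXPECTED_MD5 = "d41d8cd98f00b204e9800998ecf8427e"        # placeholder checksum

  def md5sum(path, chunk=1 << 20):
      """Compute the MD5 of a local file in 1 MB chunks."""
      h = hashlib.md5()
      with open(path, "rb") as f:
          for block in iter(lambda: f.read(chunk), b""):
              h.update(block)
      return h.hexdigest()

  # Pull the file over gsiftp, then verify its checksum locally.
  subprocess.check_call(["globus-url-copy", SRC, "file://" + DST])
  if md5sum(DST) != EXPECTED_MD5:
      sys.exit("md5sum mismatch for %s" % DST)
  print("transfer verified: %s" % DST)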

  • Glideins

Documentation, statistics and plots

    • Problem with the glidein server at ISI; waiting for a fix

Grid Operations Center (Scott T. for Rob Q.)

  • A Draft service outage report
  • Rob is on vacation Aug 16th to Sept 27th.
    • Contact Kyle (kagross at indiana dot edu) for Support Issues and Scott (steige at indiana dot edu) for Infrastructure.
  • Indianapolis machine room move: Oct. 12
  • GOC Ticket Synchronizer 1.11
  • MyOSG 1.25 (https://myosg.grid.iu.edu)
  • GOC Ticket 1.25 (https://ticket.grid.iu.edu)
  • Red Hat security updates applied to all production machines except the BDII servers.
  • External security audit of all GOC machines completed, another to be scheduled.

Engage (Mats, John)

Integration (Suchandra)

  • Finishing up OSG 1.2.13 release
    • Should be released this week
    • Includes Xrootd and Gratia updates
  • The next OSG release will consist of several small fixes to various components

Site Coordination (Marco)

Note that this report lists the currently active resources in OSG. If a site is down or not reporting, it will not be counted, so there may be fluctuations. Each line shows the current number with the variation from last week in parentheses. You can find a table with current OSG and VDT versions at http://www.mwt2.org/~marco/myosgldr.php
  • Site update status (from MyOSG as of today):
    • Most recent production version is OSG 1.2.12
    • 90 (3) OSG 1.2.X resources (11 are 1.2.12)
    • 4 (0) OSG 1.0.X resources (0 are 1.0.6)
    • 3 (-1) OSG 1.0.0 resources
    • 1 (0) OSG 0.8.0 resources

Metrics (Brian)

Virtual Organizations Group (Abhishek)

D0

  • MC Production going well. 15+ million events last week.
  • Important stakeholder request: certificate entries used for data transfer in the D0 workflow are missing from the GUMS template. This is a policy-related matter; Doug has been asked for input. Discussion ongoing.

Fermilab-VO

  • LBNE expected to register soon.

GLUE-X

  • Encountered a problem with the VDT update mechanism.
  • Discussion ongoing with JLab.
  • Making good progress on deploying a glide-in submit host at UConn.

General

Security (Mine)

  • We are interested in hearing about the Indiana security audit results. We have to apply similar tests to all OSG services and resources; if the Indiana security team has already done a similar job with the GOC services, we may not need to repeat those tests. Scott will email me explaining what tests have been done and which results have been accumulated.
  • Testing of the new certificate layout has been finished on the ITB, except for dCache; Suchandra says the dCache instance is ready, so Doug can test tomorrow. In the next IGTF cycle the new layout will become the default, and the IGTF release after that will include only the new layout. Thus we must transition to the new layout in the next production release. I will give you more information as the time approaches.
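
As an illustration of what a certificate layout check can look like, here is a hypothetical Python sketch that verifies each CA certificate in a trust directory has a matching OpenSSL subject-hash link. This is an assumption about the kind of test involved, not a description of the actual ITB test suite; the directory path is only the conventional location, and the openssl CLI is assumed to be installed:

  import glob
  import os
  import subprocess

  CERT_DIR = "/etc/grid-security/certificates"  # conventional trust-root directory

  for pem in glob.glob(os.path.join(CERT_DIR, "*.pem")):
      # Ask openssl for the subject hash of this CA certificate.
      out = subprocess.check_output(
          ["openssl", "x509", "-noout", "-subject_hash", "-in", pem])
      link = os.path.join(CERT_DIR, out.decode().strip() + ".0")
      status = "ok" if os.path.exists(link) else "MISSING"
      print("%-8s %s -> %s" % (status, os.path.basename(pem), link))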