Events for Computer Security Awareness Day will be held in Wilson Hall on Tuesday, Nov. 10 from 9 a.m. to 4 p.m.
The first bimonthly newsletter, FIFE News, was published earlier this month.
SCD's Bonnie King and CCD's Valerie Higgins were mentioned in a Chicago Tribune article about Fermilab's annual art show.
New desks and chairs arrived in the FCC first floor lobby.
Amitoj Singh and Marcia Teckenbrock discuss Supercomputing arrangements in the new lounge area on the first floor of FCC. Photo: Ruth Pordes
Theodore Schmidt (CCD/Information Systems/Business Infrastructure Applications/ServiceNow Developer)
Gianluca Petrillo (SCD/Systems for Scientific Applications/Scientific Software Infrastructure/Tools and Advanced Computing)
(5, 10, 15 & 20+ years)
Mark Schmitz - 36 years
Arthur Kreymer - 32 years
Adam Para - 30 years
Thinh Duc Pham - 28 years
Chris Stoughton - 28 years
John Marraffino - 26 years
George Szmuksta - 26 years
Merina Albert - 25 years
Stephen White - 25 years
Guilherme Lima - 24 years
Elizabeth Buckley-Geer - 23 years
Brian Yanny - 21 years
Natalia Ratnikova - 15 years
Michael Zalokar - 15 years
Anna Mazzacane - 5 years
Tammy Whited - 5 years
Computing at the next level
Computing receives ISO 20k certification in 2012
Three years after earning ISO 20k certification, Computing employees just finished their first re-certification audit for Service Management. We have been recommended for re-certification!
As explained by Service Manager Tammy Whited, certification means the auditors are satisfied that “we have good practices, processes and procedures in place for how we manage our IT services and how we do business.”
The audit has pushed, and continues to push, Computing to the next level of efficiency and effectiveness. It serves as a yardstick against which current processes can be objectively evaluated to determine what needs to be further streamlined and enhanced. ServiceNow improvements, the upcoming cloud-based email system and digital signage are a few changes motivated by last year’s audit.
Continual improvements in Computing help the whole lab, said Whited. “The audit helps us change our focus from being an IT shop to a service-oriented organization trying to provide solutions that help enable the business of science.”
Ways in which the audit helps Computing contribute to the success of the lab include:
Accountability: “The audit helps hold us accountable by making certain we have processes in place that ensure we are doing what we say we are doing,” explained Service Level Manager Brian McKittrick.
Agility: “When you understand what you have, it’s much easier to adjust to what people need,” said Whited. “If you don’t know what you have, you might not be the best steward of your resources. You might purchase something you already have or something no one needs.”
Communication: The audit puts Computing in a better position to communicate with our users, improving mutual understanding and better enabling us to meet their needs.
Stability: The audit ensures Computing has proper risk management processes and documentation in place; these have helped decrease the duration and impact of outages.
Transparency: The audit provides a better view of what is happening in Computing and helps ensure that expectations are clear.
But certification and the audit process also help Computing staff directly. “When everyone is speaking the same language and doing the same thing, you don’t have as much chaos,” said McKittrick. “The improvements we are making should make it easier for everyone to do their jobs better.”
On behalf of the Service Management team, thank you to all who participated in or contributed to the audit and to those who practice service management in their daily work.
CCD's Santo Campione presented at Workday's annual customer conference. His presentation, titled "Creating a Sustainable Workday Integration Model", attracted 175 attendees.
Marc Paterno presented, “Framework design experience from art” at the Electron Ion Collider (EIC) Software Workshop at Jefferson Lab.
SCD was heavily involved in the recent LArTPC Reconstruction Requirements Workshop on Oct. 19 and 20. All LArTPC experiments (MicroBooNE, DUNE, SBND, ICARUS, LArIAT, ArgoNeuT) were represented, as well as projects contributing to the software and computing needs, including art, LArSoft and Pandora.
Chander Sehgal presented, “Open Science Grid Physics to Campus Researchers” at HEPiX.
Now playing in the FCC lobby:
"Fermilab Trigger and DAQ Roadmap," Kurt Biery, CPAD Workshop, Oct. 6
"Exascales and Exabytes: Future directions in HEP Software and Computing," Oliver Gutsche, meeting of the Division of Particles and Fields of the American Physical Society, Aug. 6
"Framework Introduction," Marc Paterno, art/LArSoft course, Fermilab, Illinois, Aug. 3.
"Introduction to LArSoft," Erica Snider, art/LArSoft course, Fermilab, Illinois, Aug. 3-7
From the CIO: Aligning our IT investments with business strategy
Chief Information Officer Rob Roser
The Federal IT Acquisition Reform Act (FITARA), the most far-reaching IT reform legislation in two decades, was recently signed into law. Its purpose in the federal space is to consolidate IT and give the agency CIOs approval authority for all procurement, workforce and technology-related budget matters. While this consolidation is viewed as an important step toward greater efficiency, one of the unintended consequences of this legislation is that it’s also being applied to the national labs.
Fermilab’s prime contract requires, among other things, that DOE annually assesses our scientific, technological, managerial and operational performance. Computing, in general, and IT, specifically, will now be one of the key elements of our prime contract that Fermilab will be judged upon.
In order to satisfy FITARA and still be able to optimally execute our work, the national lab CIOs, along with DOE, have proposed the following approach.
- Integrate CIO and IT strategy consideration into DOE management and operating (M&O) contract competition evaluation.
- Integrate IT strategy for R&D and operations into the overall laboratory planning and assessment processes.
- Integrate the CIO into existing review processes for major R&D investments.
Why am I telling you all this? Because our ITIL processes and ISO 20k certification are more important now than ever. First and foremost, the processes build a foundation, a code of conduct, for how we do business. By its very nature, service management forces us to align our IT investments with business strategy; more succinctly, we are already doing what FITARA was designed to improve. Our ISO 20k certification is an independent validation to the federal government that we are acting properly. Now we just need to document it to DOE to get credit.
That said, we did not go down this ITIL/Service Management path to satisfy FITARA. We did it because it was the right thing to do. ITIL provides a powerful framework that we can leverage to deliver the right services at a cost-effective price point, along with metrics to measure how well we are doing and to identify areas that need additional attention.
This week, we are undergoing a full recertification for ISO 20k. This is an important milestone for both the lab and us. Thank you to Brian McKittrick, Tammy Whited, Vicky White and everyone who has been involved in this effort.
Lights, camera, collision!
The Blanco telescope in Chile on which DECam is mounted. Credit: T. Abbott and NOAO/AURA/NSF.
The last frontier is being understood piece by piece as we learn about the particles that make up space. One of Fermilab’s most recently launched discovery efforts, Dark Energy Survey-Gravitational Wave (DES-GW), is the hunt for neutron star collisions detected through gravitational waves. Like any good challenge, there’s a catch: scientists don’t know for certain that gravitational waves exist, because they have never been directly detected.
Scientists in SCD are contributing to the discovery effort through DES-GW, an approximately three-year collaboration headed by the Particle Physics Division, with important contributions from SCD and various universities, using data from the Laser Interferometer Gravitational-Wave Observatory (LIGO). The project started in mid-September when LIGO began operations. LIGO detects potential gravitational waves caused by collisions between neutron stars, but not the exact location of the event. DECam produces images that can provide a more precise location.
When LIGO detects a possible gravitational wave, it sends a signal to other organizations including Fermilab. If the source of the gravitational wave fits into or near the area the DECam can observe, SCD’s Jim Annis takes the reins. He creates a probability chart to decide where to take images based on the probability that a detectable collision will be captured by the DECam within the region given by LIGO.
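The pointing decision can be pictured with a toy sketch. This is purely illustrative, not the actual DES-GW code: the function name, the coarse grid and the field-of-view size are all assumptions. Given a probability map of the LIGO localization region, it greedily picks the camera pointings that capture the most probability.

```python
import numpy as np

def choose_pointings(prob_map, fov=3, n_pointings=5):
    """Greedily pick n_pointings fov x fov windows with the highest
    summed probability, zeroing each chosen window so pointings
    do not overlap (a stand-in for the real probability-chart step)."""
    prob = prob_map.copy()
    pointings = []
    for _ in range(n_pointings):
        best, best_rc = -1.0, (0, 0)
        # scan every possible window position for the largest captured probability
        for r in range(prob.shape[0] - fov + 1):
            for c in range(prob.shape[1] - fov + 1):
                s = prob[r:r + fov, c:c + fov].sum()
                if s > best:
                    best, best_rc = s, (r, c)
        r, c = best_rc
        prob[r:r + fov, c:c + fov] = 0.0  # mark this area as already covered
        pointings.append((best_rc, best))
    return pointings
```

The greedy choice mirrors the idea in the text: each exposure goes where the remaining chance of catching the collision is highest.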
DES-GW will search for flashes of light associated with neutron star collisions by comparing previous DECam images of an area to a new image of the same area. When the images are processed, ones that vary from the original template image are further analyzed to see if the variance fits the parameters of a neutron star collision and corresponds to LIGO’s findings.
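The comparison step described above amounts to difference imaging. The sketch below is a minimal illustration under assumed names and an arbitrary threshold, not the real DES pipeline: subtract the template image from the new image of the same area and flag pixels that brightened enough to be candidate transients.

```python
import numpy as np

def find_transients(template, new_image, threshold=5.0):
    """Return (row, col) pixel coordinates where the new image exceeds
    the template by more than `threshold` counts; these are the
    variances that would be analyzed further."""
    diff = new_image.astype(float) - template.astype(float)
    rows, cols = np.where(diff > threshold)
    return list(zip(rows.tolist(), cols.tolist()))
```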
The light from these collisions can fade in a matter of days, so it is essential to process the DECam images within 24 hours to give the team time to decide if it is worth taking and processing more images of the same area the next night. In addition, images come in large batches at irregular and unplanned intervals, and they arrive after work hours. To solve these problems, SCD’s Ken Herner developed a way for the images to be processed automatically and in parallel by adjusting the code from a previous project, Supernova, and utilizing a combination of Fermilab and OSG grid resources to handle the extra data processing. “[The image processing pipeline] can run jobs anywhere from day one,” explained Herner. “When you get a new image, it will be automatically submitted to be processed, and the script will figure out which of the previous images overlap with it. You can just sit back and wait.”
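The automatic, parallel handling Herner describes can be sketched in miniature. Everything here is an assumption for illustration only: the region format, the overlap test and the use of a local thread pool in place of Fermilab and OSG grid submission.

```python
from concurrent.futures import ThreadPoolExecutor

def overlaps(region_a, region_b):
    """True if two (ra_min, ra_max, dec_min, dec_max) sky regions overlap."""
    return not (region_a[1] < region_b[0] or region_b[1] < region_a[0] or
                region_a[3] < region_b[2] or region_b[3] < region_a[2])

def process(new_image, archive):
    """Find archived images overlapping the new one; in the real system
    each overlapping pair would become a grid job producing a difference image."""
    return [old for old in archive if overlaps(new_image["region"], old["region"])]

def submit_all(new_images, archive, workers=4):
    # Each new image is handed off as soon as it arrives, so no one
    # needs to be awake when a batch comes in after work hours.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda img: process(img, archive), new_images))
```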
Now that Herner has almost completed work on the image processing pipeline and is running test jobs to exercise the system and work out any issues, the DES-GW team is excitedly awaiting a signal from LIGO.
Service Management/Business Analyst
I started working at Fermilab in 1990 as a contractor for the SSC project in the Technical Division. I proudly signed the first magnet transported to the facility with a big magic marker. I was hired as a full-time employee in 1992 and subsequently worked for FESS in Operations and Maintenance, the Safety Office, Time and Materials and then Business Services in the Management Information Systems group, which was later absorbed by Computing.
I am currently the liaison for the Directorate, WDRS and ESH&Q. Since the reorganization of the ES&H Office, I have been busy assisting them with the task of identifying and bringing all ESH&Q systems under one supportable umbrella.
In addition, I am currently working on an effort between ESH&Q and WDRS to “Contract Contingent Workers” via FermiWorks. We have successfully rolled out the functionality to FESS and will work our way through the rest of the lab organizations. What this allows us to do is consolidate all of our workers into one system that manages the same high-level details for contingent workers (non-Fermilab payroll workers) that the system does for Fermilab payroll employees. We are now able to capture information regarding the organizations that bring workers to Fermilab, workers’ emergency contacts and who is responsible for ensuring these workers take the appropriate training. Managers are able to take a proactive approach to extending or expiring badge access and controlling computer privileges.
Outside of the lab, my husband Dennis and I are busy supporting our twelve-year-old’s budding MLB career. We really enjoy watching him push the envelope and challenge himself in ways we never could have imagined.
Systems for Scientific Applications/Real-Time Systems Engineering/Real-Time Software Infrastructure
Wesley Ketchum inside one of the modules of the ICARUS T600 detector
I am a new addition to the Scientific Computing Division, having just officially started as an Associate Scientist this past September! I’ve enjoyed collaborating with members of Computing over the past eight years that I have worked at Fermilab. I started working at Fermilab as a graduate student at the end of the run of the Tevatron. I worked on data analysis and the online trigger system at the CDF experiment. After finishing my Ph.D. at the University of Chicago, I joined Los Alamos National Laboratory as a post-doc, making the switch to neutrino physics, and began working on using liquid-argon time projection chambers (LArTPCs) to detect and study neutrino interactions and search for new physics.
As part of SCD, that’s what I continue to work on now. I am co-leader of the data acquisition commissioning team and the reconstruction software development team for the MicroBooNE experiment, which has just begun its physics run and is detecting its first neutrino interactions. MicroBooNE will study neutrino interactions on argon using its high-resolution LArTPC, and further the search for a new, “sterile” neutrino by investigating an anomaly observed in the MiniBooNE experiment’s data.
I’m also involved in the development of a number of new experiments, especially the two LArTPC detectors that will join MicroBooNE in its search for sterile neutrinos on the Booster neutrino beamline: SBND and ICARUS. I am working to develop the DAQ for both detectors based on Fermilab’s artdaq software framework, and I am currently working at CERN, where the ICARUS detector is being refurbished and prepared for a long voyage from Switzerland to Illinois!
When I’m not writing new software or playing on a test stand, I enjoy cooking, traveling, photography, cycling, and listening to and playing music.