Alma Cataloging User Group - March 2021
March 8, 2021 – Academic Cataloging Group Notes
Attendees: Amy (NDW), Drew (UND), Shelby (UND), Laurie (UNE), Julia (NMI), Kelly (NMY), Ben (NDV), Jasmine, Linda O (UND), Staci, Jenny (NDSU), Tina (NDSU), Lynn (ODIN), Jason (ODIN), Liz (ODIN)
- Comments/Questions? - Jenny received an email message from Ex Libris announcing that Alma may be moving to another server. Shelby also saw the message and noted that there is a big server farm in Moses Lake, WA which Ex Libris uses.
- March release of Alma (March 7, and update March 14)
- New Metadata Editor is turned on as the default; most cataloging roles allow users to opt out
- Rules are now available in the new ME
- New Metadata Editor – Is anyone trying it out? How is it going? On June 6 the new Metadata Editor will become the exclusive option. – Ben noticed a strange thing: when editing holdings, updating seems less smooth, and the editor abruptly moves him from one subfield to another. This seems to happen only when editing holdings records, not bib records. Shelby has found that it takes a bit of looking around to find where things have moved. Add holding record is under "New," not Inventory as expected, but it is going to be available in both places, New and Inventory. It is working for Shelby as long as she uses a certain login and Chrome. We would want to hear more about problems like the ones Ben and others are reporting, because we would want to know about them before June.
- When importing records to Alma, the MARC record automatically supplies UNDA in the 049 field and UND in the 994 field. If we are the first library to own the item, should we change these fields to reflect local library codes (e.g., NDJA & NDJ)? – If you feel it is important, add a 972 field with the shelflist code for future reference.
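The field swap described above can be sketched in a few lines. This is an illustrative sketch only, not Alma's API: the record is modeled as a plain dict of MARC tags, and the function name, layout, and the sample shelflist code "NDJ-SHELF" are assumptions; only the tags (049, 994, 972) and codes (UNDA, UND, NDJA, NDJ) come from the discussion.

```python
# Sketch: replace consortial defaults supplied on import with local codes,
# using a simplified dict-of-tags representation of a MARC record.
def localize_record(record, holding_code="NDJA", institution_code="NDJ",
                    shelflist_code=None):
    """Swap UNDA/UND defaults for local codes; optionally add a 972."""
    for field in record.get("049", []):
        if field.get("a") == "UNDA":       # default supplied on import
            field["a"] = holding_code      # e.g. NDJA
    for field in record.get("994", []):
        if field.get("b") == "UND":
            field["b"] = institution_code  # e.g. NDJ
    if shelflist_code:                     # 972 shelflist code, if wanted
        record.setdefault("972", []).append({"a": shelflist_code})
    return record

# Minimal example record as it might arrive from the import profile.
record = {"049": [{"a": "UNDA"}], "994": [{"b": "UND"}]}
localize_record(record, shelflist_code="NDJ-SHELF")  # hypothetical code
```

In a real workflow this logic would live in a normalization rule or a script run against exported records rather than ad-hoc Python.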
Data Cleanup/Data Review:
- Authorities – From Tina:
How do we want to handle changed/updated LC subject headings?
Here's an example. The LCSH "Armenian massacres, 1915-1923" was changed to "Armenian Genocide, 1915-1923" in October of 2020. (LC changed it after years of advocacy by Armenians, historians, etc. because calling a genocide "massacres" is a way of whitewashing. LC was reluctant to change it for a long time because the government of Turkey would be unhappy about it—it was a whole thing!)
We have a bunch of instances of "Armenian massacres, 1915-1923" that have not been updated. Here are just a few of them:
Children of Armenia : a forgotten genocide and the century-long struggle for justice
Justifying genocide : Germany and the Armenians from Bismarck to Hitler
Survivors : an oral history of the Armenian genocide
(Dickinson, UNDLL, Williston)
Black dog of fate : a memoir
(Dickinson, Mayville, UND)
To my mind, there are two basic approaches we could take to making these updates. One would be to do it using the Alma authorities update functionality in the NZ. The other would be to overlay our NZ records with the current versions of the WorldCat records, using OCLC's service where they send you (on a regular schedule) batches of the records with your holdings on them that have been updated.
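Either approach boils down to the same record-level change: find bibs carrying the outdated heading and substitute the current one. A minimal sketch of that substitution, assuming a simplified dict representation of a bib record (the layout and function name are illustrative, not the Alma authorities job):

```python
# Sketch: swap an outdated LCSH string for the current heading in 650 $a.
OLD = "Armenian massacres, 1915-1923"
NEW = "Armenian Genocide, 1915-1923"

def update_heading(record, old=OLD, new=NEW):
    """Return how many 650 $a subfields were updated in one bib record."""
    changed = 0
    for field in record.get("650", []):
        if field.get("a") == old:
            field["a"] = new
            changed += 1
    return changed

# Two of the affected titles listed above, as simplified records.
bibs = [
    {"title": "Survivors : an oral history of the Armenian genocide",
     "650": [{"a": OLD, "ind2": "0"}]},
    {"title": "Black dog of fate : a memoir",
     "650": [{"a": OLD, "ind2": "0"}]},
]
touched = sum(update_heading(b) for b in bibs)
```

The Alma preferred term correction job and the OCLC batch-overlay service each do a version of this at scale; the question for the group is which mechanism to rely on.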
During migration, when Ex Libris was deciding which jobs to turn on, Shelby asked about the preferred term correction job; Ex Libris said no. Shelby explained that this job does not make changes retrospectively on migrated records. She has questions about how it works: how would she know which records she would have to do something to? How would she know which term corrections have failed?
Shelby showed the configuration setting that activates or disables the preferred term linking job. She is assuming that the preferred term correction job reports would help us know what needs to be done. She then showed the Authority Control Task List. In the institution zone it is showing ILL requests and orders, which are records not worth working on.
Tina pointed out another approach to accomplishing the same thing: OCLC has a service where, any time an OCLC record that you have your holdings on is updated, you receive a batch file of those records on a regular basis. You then load them and get updates to the subject headings and anything else that changed in those records. She thinks it would need to be done at the network zone level to make any sense.
Tina said the way they received those batch updates was part of their regular OCLC subscription; you just go into Collection Manager and set it up. Julia and Jasmine said it would be more efficient if ODIN had a subscription and set it up for all libraries.
Shelby and Tina suggest that ODIN get its own account in order to receive these records and load them into the network zone. Tina thinks she still has theirs set up, but they have not been loading the records since migration. Shelby is interested in how that is set up in Record Manager.
Liz suggested setting it up for NDSU and seeing how it works. Tina said she will share how it is set up.
Preferred Term Correction – decided to turn it on and have the reports sent to the listserv. Julia asked if it would do a mass update so records would automatically update afterwards. We don't think it will do a mass update, but if a term has a binoculars icon next to it in the Metadata Editor, it is a linked heading, and linked headings will update when the term in the authority file is updated. Shelby said this has worked with name headings.
- NZ related issues
- Shelby looked into inventory management in the network zone with regard to access to electronic resources among the UND libraries; it could also be useful for BSC and NDI students using the same resources. Do we need a meeting for the libraries interested in this, or should we table it for a later time?
- Bound-with/associated bibs: 773 tags for NDSU did not migrate as local to the new Alma instance; for the Aleph group they did. – UPDATE: SF case #00900505 created 11/18/20 for ExL to investigate. NDSU has provided information for the ticket, and we are waiting for a response. No news on the ticket, but NDSU has sent in examples.
- Updating 035 OCLC numbers when OCLC changes – UPDATE:
- Liz and Jason changed the import profile back to merge (12/15).
- It appears the import profile is matching on the "old" OCLC number but not updating it. Has anyone encountered this since the last meeting? – Shelby watched a session at ALA where a library set up a Python script to update records' OCLC numbers. Shelby has encountered times where she does not find a record when searching for a new number because the record still has an old number. It would be nice to find a way to keep the numbers up to date.
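The kind of script mentioned at ALA might look like the following. This is a hedged sketch under assumptions: the old-to-current number mapping, the "(OCoLC)" prefix handling, and the simplified dict record layout are all illustrative; keeping the superseded number in 035 $z so the record stays findable by either number is a common convention, not a statement of that library's actual code.

```python
# Sketch: rewrite a record's 035 $a when its OCLC number has been merged,
# retaining the old number in 035 $z for discoverability.
MERGED = {"(OCoLC)12345": "(OCoLC)67890"}  # hypothetical old -> current map

def refresh_oclc_number(record, merged=MERGED):
    """Update 035 $a to the current control number, keeping the old in $z."""
    for field in record.get("035", []):
        current = merged.get(field.get("a"))
        if current:
            field.setdefault("z", []).append(field["a"])  # preserve old
            field["a"] = current
    return record

rec = {"035": [{"a": "(OCoLC)12345"}]}
refresh_oclc_number(rec)
```

In practice the mapping would come from OCLC's merged-record reports or an API lookup rather than a hard-coded dict.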
- Liz has set up the integration profile so that OCLC Connexion can export directly to the NZ. – Tina, Julia and Shelby have tested it. It is one of many ways to get records into the system; the rationale is needing records faster than the 1-2 days of the import profile. Liz pointed out that it is nice to know immediately that the record came in. It is still an issue that records are not updating via the import profile.
Shelby has a number of staff happy with F8; on the other hand, they run into situations with government documents records already in the NZ where the new record will not come through. This way, when placing an order, they have a record before they even have the item. Sometimes Shelby updates the record in OCLC and brings it in via search in the Metadata Editor, but that probably bypasses the normalization rule that removes the unwanted fields. There are a lot of ways to get records into Alma.
Jasmine said Alma is a backslide from Aleph in this respect: faster would be nicer, and batch import would be fantastic.
Pending/ Old items:
- MDE slow in new UI – Reported to ExL in SF#00895869 – UPDATE: Shelby thinks she has this solved; it was caused by other software on her machine interfering with the ME.
- What topics would you like to see covered in documentation, training seminars, or the next ODIN Work Day? Please submit to email@example.com or firstname.lastname@example.org. UPDATE: Ginny is working on the Work Day(s) agenda. Tentatively, sessions should start in January. Sessions are expected to be 30-60 minutes in length, with a few presented each week. Watch for more details as they are finalized, but please send in requests for sessions you would like to see AND volunteer to present sessions you can share!
- Email to the list 1/6: there is a problem with the "Link a set of records to the Network Zone" job. When a record matches an NZ record, the job makes a duplicate record and appears to make the inventory disappear. UPDATE: The issue is not resolved yet (SF ticket 920222); we received an update via the MNPALS listserv (2/4) that running the "Recalculate Local Resource Types" job after linking causes reindexing that fixes the issue.