A wonderful experience


It was intense and inspiring. I would love to return to take other courses on new interesting topics. Thank you all!


Unforgettable


Sunset from Scripps Institution of Oceanography


Inside Scholarly Communications

Course chair
Cameron Neylon, PhD
Professor of Research Communications, Centre for Culture and Technology,
Curtin University, Australia


What to expect


An overview of today’s scholarly communications landscape: how we got here, the current state of the field, and how it is changing.


Key topics


  • The history of scholarly communications.
  • The economics of publishing.
  • Data availability.
  • Issues surrounding peer review.
  • Major trends for the future.

Practice and Takeaway


1. When is something published? Functions of the scholarly communication system

Sketching the landscape:
Publishing involves many actors and processes.


  • State and standing as a dynamic alternative:
    Our metadata systems and approaches are rooted in two assumptions that don’t hold very well today:

    • That views about the value and standing of an object are universally held across scholarly (and non-scholarly) communities.
    • That there is a clear, final version of a research work: the formally published “version of record.”
  • Standards are usually focused on the needs and interests of publishers, not of libraries or researchers.
  • Metadata schema are a great place to look for assumptions about what matters and who cares.

2. Opening the black box of scholarly communication funding

Understanding the complexity of financial movements in scholarly communications and the importance of transparency in understanding such flows.

  • Different sources of funding circulate in the service of publishing.
  • Where the money goes: to publishers!
  • Financial opacity: the lack of publicly accessible financial information is problematic:
    • Prevents a detailed view of how much money is paid (by a country, academic sector, university, library, or individual researcher).
    • Weakens the negotiating power of universities and libraries in the scholarly publishing market.
  • Analyzing the presented model (Lawson, S., Gray, J. & Mauri, M., 2016); a code sketch of these flows follows the list of critiques below:


  • The model doesn’t include payments back from publishers to, e.g., scholarly societies as part of publishing deals.
  • It is a model of charges not of costs.
  • It doesn’t include individuals as potential holders of money and purchasers.
  • “National” negotiations might be consortial or might be regional rather than national.
  • There are a range of other important service providers besides publishers that sometimes sit between institution and publisher and sometimes downstream (subscription agents, aggregators and others).
  • It doesn’t explicitly capture flow of money out of the system (e.g. to publisher shareholders, or into other parts of the research system, other than publishing).
  • It doesn’t explicitly capture flows of money from publishers back to institutions or researchers (editors, scholarly societies).
  • The institution and the publisher might be the same organization.
  • Separate ‘private’ into corporate and individual donors?
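To make the critiques above concrete, here is a minimal sketch (in Python, since the document has no code of its own) representing charges as a directed graph of (payer, payee, kind) edges. The actor names and flow kinds are illustrative assumptions loosely inspired by the Lawson, Gray & Mauri diagram, not data taken from it; the commented edges at the end are the kinds of flows the critiques say the model omits.

```python
# Illustrative sketch: scholarly-publishing money flows as a directed
# graph of (payer, payee, kind) edges. All actor names and flow kinds
# are hypothetical, not taken from Lawson, Gray & Mauri (2016).
flows = [
    ("government funder", "university", "block grant"),
    ("research funder",   "researcher", "project grant"),
    ("university",        "library",    "budget allocation"),
    ("library",           "publisher",  "subscription payment"),
    ("researcher",        "publisher",  "APC"),
    # Flows the critiques above note are missing from the model:
    ("publisher", "scholarly society", "society support payment"),
    ("publisher", "researcher",        "editor payment"),
    ("publisher", "shareholders",      "profit leaving the system"),
]

def incoming(actor):
    """All flows arriving at a given actor."""
    return [(payer, kind) for payer, payee, kind in flows if payee == actor]

# "Where the money goes: to publishers!"
for payer, kind in incoming("publisher"):
    print(f"{payer} -> publisher ({kind})")
```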

Case Study: Mexico

    • The majority of funding is public (Consejo Nacional de Ciencia y Tecnología, CONACYT): approximately US$6.3 billion in 2014, with a push to increase science funding.
    • The negotiating body is very strong (Consorcio Nacional de Recursos de Información Científica y Tecnológica, CONRICyT): see the Statement of the First Consortium Assembly from Ibero-America and the Caribbean.
    • Key recommendations from this Statement: insist, as part of all partnership policies, that annual price increases above 3% not be accepted; agree that an OA expansion policy through the payment of APC fees is financially impossible for the participating countries; and recommend that institutions not create grants to pay for publication in APC-based OA journals.

3. Main issues of peer review as currently practiced


  • Peer review is seen as the central defining characteristic of the scholarly literature.
  • However, it is a very variable and opaque process.
    • Reviewer bias.
    • Pool of reviewers not keeping pace with demand. How many carry out rigorous review processes?

Open South: The Open Science Experience in Latin America and the Caribbean

Course chair
Gimena del Rio Riande, PhD
Researcher, Instituto de Investigaciones Bibliográficas y Crítica Textual (IIBICRIT), National Scientific and Technical Research Council (CONICET), Buenos Aires, Argentina.

Instructors
Wouter Schallier
Chief, Hernán Santa Cruz Library, UN/ECLAC (Economic Commission for Latin America and the Caribbean, United Nations), Santiago, Chile.

April M. Hathcock
Scholarly Communication Librarian, New York University.

Daniel O’Donnell, PhD
Professor of English, University of Lethbridge, Canada.



What to expect


  • Key arguments for why open and public access to information is important for science and for citizenship both on a local and global scale.
  • To discuss concepts of Open Science, Open Data, and Open Access, particularly as they are practiced in the Latin American region.
  • A practical approach to dealing with the “different open accesses in the world.”
  • The potential of North-South collaboration in building and managing projects in open scholarship.
  • Strategies for effective project management: “from great ideas to great projects.”

Key topics


  • Open Science, Open Data, Open Access
  • Open Scholarship, Public Scholarship
  • Access to information
  • North-South collaboration
  • Project management



 

Practice and Takeaway


  • Access to information plays a critical role in supporting development.
  • Some countries in the LAC region have shown real advances in national laws that seek to make scientific knowledge produced with public funds a common good managed by the academic community (Mexico, Argentina, Peru, Brazil).
  • There are also regional projects such as SciELO and Redalyc that have played an important role in making the scientific production published in Ibero-American and Latin American journals available (and recognized).
  • Nevertheless, Open Access in Latin America and the Caribbean still faces challenges that need to be tackled in order to consolidate the model and make it fully interoperable with global Open Access models.
  • The LAC region faces significant infrastructural, financial, and communication constraints and barriers to building a global open scholarship.
  • There is a need to raise awareness of research and data management and implement institutional policies.

What would we ideally want to see in open scholarship in terms of infrastructure, tools, partnerships, etc.?

How can we bring the state of global open scholarship from where it is now to where we want it to be?

How can we begin to address these constraints?


Projects presented:
Good practices, examples of institutional policies and practical recommendations from Europe and Latin America and the Caribbean.

1. LEARN (LEaders Activating Research Networks): Implementing the LERU Research Data Roadmap and Toolkit.
This project has created resources to help research-performing institutions manage their research data.

2. Open Access Infrastructure for Research in Europe (OpenAIRE)
Enabling researchers and universities involved in Horizon 2020 projects to comply with the European Commission’s open access policy.


Formulation of a new project (in collaboration with Ricardo Hartley, Chile):

Nature: Metrics related to Open Science; access to scientific literature and supplementary data.

Objective: To perform formal, systematic, and periodic monitoring of the accessibility of research publications produced in LAC countries (open research articles, open books, etc.) that are internationally visible (indexed in WoS or Scopus).

  • To develop a battery of metrics related to open publishing in LAC countries.
  • To generate a ranking of LAC institutions according to these metrics (a minimal code sketch follows the variables list below).

Variables / Focus

  • License/OA status that governs the use of published articles: gold, bronze, green (accepted or published version).
  • Affiliation of principal researcher.
  • Presence of supplementary data (raw data, processed data).
  • Differences across disciplines, language of publication.
  • To analyze sources of publication (national, regional or “international” journals).
  • Correspondence (or not) between the production of articles by disciplinary areas and the presence of supplementary data.
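The sketch below (Python) illustrates how such a battery of metrics could feed a ranking: given per-publication records, it computes an average openness score per institution and sorts institutions by it. The record fields, OA-status weights, and open-data bonus are hypothetical illustrations for this sketch, not part of the project specification.

```python
from collections import defaultdict

# Hypothetical publication records; field names are assumptions for
# this sketch, not the project's actual data model.
publications = [
    {"institution": "Universidad A", "oa_status": "gold",   "has_open_data": True},
    {"institution": "Universidad A", "oa_status": "closed", "has_open_data": False},
    {"institution": "Universidad B", "oa_status": "green",  "has_open_data": True},
    {"institution": "Universidad B", "oa_status": "bronze", "has_open_data": False},
]

# Illustrative weights per OA status, plus a bonus for open
# supplementary data (both are assumptions, not a standard).
OA_WEIGHTS = {"gold": 1.0, "green": 0.8, "bronze": 0.5, "closed": 0.0}
OPEN_DATA_BONUS = 0.2

def openness_by_institution(records):
    """Average openness score per institution."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for rec in records:
        score = OA_WEIGHTS[rec["oa_status"]]
        if rec["has_open_data"]:
            score += OPEN_DATA_BONUS
        totals[rec["institution"]] += score
        counts[rec["institution"]] += 1
    return {inst: totals[inst] / counts[inst] for inst in totals}

# Rank institutions by average openness, highest first.
ranking = sorted(openness_by_institution(publications).items(),
                 key=lambda item: item[1], reverse=True)
for rank, (institution, score) in enumerate(ranking, start=1):
    print(f"{rank}. {institution}: {score:.2f}")
```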

Location:

Funding:

Attendees: universities, national S&T bodies

Partners: ECLAC, OPS, MERCOSUR

Barriers:

  • Lack of funding.
  • Value that the community gives to international rankings (QS, SCImago).

We expected to build a bridge between the North and the South in terms of Scholarly Communication. 

We learnt it’s difficult but not impossible. 

We should definitely devote more time to hands-on work.

Walking the Walk: Promoting and Maintaining Best Practices in Fair and Open Evaluation

 

Course chair
Stefan Tanaka, PhD
Professor of Communication, University of California San Diego, USA

Instructors

Daniel O’Donnell, PhD
Professor of English, University of Lethbridge, Canada

Allegra Swift
Scholarly Communications Librarian, UC San Diego, USA

David De Roure, PhD
Professor of e-Research, University of Oxford, UK


Description


As we are building tools, changing infrastructure, and creating new ways to disseminate and share scholarly output, we have been less focused on one of the most difficult (because it is cultural) issues: the institutional reward and recognition system, including policies and practices regarding merit, promotion, and tenure.

Current systems of evaluation and reporting remain entrenched in age-old practices while falling further behind advances in scholarship, research methods, publication, impact measurement, and reporting. Uncertainty and misinformation often circulate among evaluators and the evaluated alike, and barriers to the evolution of scholarly communication make it more difficult to entice and retain the best and the brightest.

This course will unpack current practices. It will discuss official university requirements, examine best-practice statements of various disciplines, compare the varied application of these policies and explore strategies for updating promotion practices.


Key topics


  • Academic policy manuals
  • Minefields
  • Institutional (e.g. scholarly associations) statements
  • Common practices
  • Examples of positive practices
  • Metrics

What to expect


  • Acquire tools for distinguishing policy from rumors and traditions.
  • Become familiar with organizations and institutions promoting fair and open evaluation.
  • Looking for strategies to move these conversations forward.
  • A straightforward overview of metrics that could be used.
  • Thinking about ways to rebuild the system after we blow it up.
  • Practical approaches to help reshape the landscape.
  • Actionable next steps that I could undertake at my home institution to take part in this discussion and push openness forward.
  • Looking to learn from different experiences and perspectives from faculty.
  • Implementing CRIS/RIMs in the Dominican Republic, currently at the beginning stages.
  • Elsevier, evaluation, and people judging others on their merits: who gets to determine what is valuable and useful?
  • Running a researcher identifier program: what are the incentives, and how do we get people promoted on better science instead of current structures?
  • What can librarians do in this space? How can we support faculty in meeting the requirements, whatever they may be?
  • Impact metrics for data: open data, and how to make it reliable and productive in evaluation; combining different methods to gather and communicate the impact of datasets.
  • Research outputs aren’t always traditional and are often not taken seriously; comparison with other disciplines.
  • Values-based approach to P&T: how non-traditional forms of scholarship might be evaluated (service bucket, peer review, mentoring).
  • Looking at diversity in these processes.
  • Defining the process: if current behaviors are driving bad science, what would drive good science? Values-based criteria for the scholarly commons; FAIR data.

Practice and Takeaway


Political and cultural landscape

  • Pinch points and common practices:

This is where finger-pointing often hinders understanding: different interests blame others. Points where the status quo is maintained (often inadvertently) range from uncertain faculty and departmental cultures to university-wide review committees and administrative practices.

MLA: Guidelines for Evaluating Work in Digital Humanities and Digital Media

***More focused on content, less on the metrics***

***Impact beyond academia***


Promoting diversity of practices

Updated practices and guidelines for open research. Many disciplines have issued or updated statements that recognize the changing digital landscape.

Center for Open Science: TOP Guidelines

The TOP Guidelines were created by journals, funders, and societies to align scientific ideals with practices.

Transparency and Openness Promotion guidelines: eight modular standards, each with three levels of increasing stringency. Journals select which of the eight transparency standards they wish to adopt, and choose a level of implementation for each. These features provide flexibility for adoption depending on disciplinary variation, while simultaneously establishing community standards.
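A minimal sketch of how such a profile could be represented: the eight standard names below follow the published TOP guidelines, and the level glosses (1 = disclose, 2 = require, 3 = verify) reflect their stated ladder of stringency, while the example journal profile itself is a hypothetical illustration.

```python
# The eight modular TOP standards (names follow the published guidelines).
TOP_STANDARDS = [
    "Citation standards",
    "Data transparency",
    "Analytic methods (code) transparency",
    "Research materials transparency",
    "Design and analysis transparency",
    "Preregistration of studies",
    "Preregistration of analysis plans",
    "Replication",
]

# Three levels of increasing stringency.
LEVELS = {1: "disclose", 2: "require", 3: "verify"}

# Hypothetical journal profile: the journal adopts a subset of the
# standards and picks an implementation level for each one.
journal_profile = {
    "Data transparency": 2,
    "Analytic methods (code) transparency": 1,
    "Preregistration of studies": 1,
}

def summarize(profile):
    """List every standard with its adopted level, if any."""
    for standard in TOP_STANDARDS:
        level = profile.get(standard)
        if level is None:
            print(f"- {standard}: not adopted")
        else:
            print(f"- {standard}: level {level} ({LEVELS[level]})")

summarize(journal_profile)
```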



San Francisco Declaration on Research Assessment (DORA)

Leiden Manifesto

  1. Quantitative evaluation should support qualitative, expert assessment.
  2. Measure performance against the research missions of the institution, group or researcher.
  3. Protect excellence in locally relevant research.
  4. Keep data collection and analytical processes open, transparent and simple.
  5. Allow those evaluated to verify data and analysis.
  6. Account for variation by field in publication and citation practices.
  7. Base assessment of individual researchers on a qualitative judgement of their portfolio.
  8. Avoid misplaced concreteness and false precision.
  9. Recognize the systemic effects of assessment and indicators.
  10. Scrutinize indicators regularly and update them.

What is useful here that you can use at your institution?

  • As an early career researcher: an advisor or other faculty mentor who understands open practices can provide an example of how you can be open.
    • Librarian: Could respond by linking researcher/faculty/student with mentor, by providing examples of how to integrate and justify openness
    • Students can join disciplinary organization
    • Students need support learning how to be an advocate (for self). Scripts could help.
  • By exposing ourselves to the language of faculty around these issues, we can perhaps include ourselves in a space where librarians are typically excluded.
  • Reminder that the impact factor was historically a tool for selling journals to librarians. Reflect on how it was never meant to be used this way.
  • Hook people with a conversation that is of interest (predatory journals) and then expand to the broader scholarly communication landscape.
  • Just-in-time formatted content: videos.

What is a problem that you find most difficult?

  • Librarians are routinely excluded (from conversations, from related literature)
  • Numbers, rankings are still the dominant value system
  • Getting librarians up to speed in a changing landscape to take on these discussions
  • Need for explicit guidelines/framework for mentoring within this space

What did we learn?

  • It is really complicated and the systems in place are really entrenched.
  • San Francisco Declaration on Research Assessment (DORA), a framework for developing a more holistic approach to assessing research.
  • That incremental change is the best that can be hoped for, and that a neat segue is probably needed to do so.
  • An argument against using JIFs for assessing research.
  • Publications must be evaluated, not merely enumerated.
  • Considering different types of outputs, documents.
  • Hard to arrive at a practical approach because we are still in a mode of discovery. And this is an issue that is very locally structured, so you can look to practices adopted elsewhere, but there is no one path forward.
  • Reminded how numbers feel more fair, less biased, but can be abused just like anything else.
  • Reminded to be suspicious of incentives that incentivize what people should be doing anyway.
  • Idea of preparing a script/talking points for when someone brags about their h-index or promotes the value of the h-index.