Tuesday, December 23, 2008

Update: Andrew Stammer's response to my notes from his presentation at the APSR Open Access Publishing Workshop

On Monday, 8 December 2008, I blogged my notes from the Australian Partnership for Sustainable Repositories (APSR) Workshop entitled, Open Access Publishing: A PKP User Group Workshop, which I attended in Sydney on Thursday 4 December 2008.

In my notes, I focused on the presentation given by Andrew Stammer, Journals Publishing Director at CSIRO Publishing. Today I received an email from Andrew, responding to my post. Andrew has kindly permitted me to post his comments in full:

There are two points I’d like to explore further with you.

1. I did not say it was part of a publisher’s role to lobby. I did say that in response to the challenges presented by Open Access this publisher is doing the following:

  • Striving for quality in content
  • Striving for quality in delivery
  • Promoting what we do
  • Nurturing the relationships
  • Offering OA options
  • Lobbying
  • Engaging in the dialogue

Publishers do lobby to gain influence, just as proponents for OA lobby to gain influence. It’s how the game is played.

2. Costs of producing a journal also astonish me. You may be sure that if we could publish more cheaply, without compromising quality, we would. Your suggestion to try printing on demand is sound – indeed that is what we do. It is just that the demand is great, so we have to print a lot of copies. Our journals are available online as well as in print. Subscribers pay a significant premium to receive print in addition to the online version. I wonder why they keep ordering print, but they do, and so we supply it. Some publishers have forced the issue by stopping print versions. The American Geophysical Union is an instance of this; they will cease print in 2010. I know of another publisher that ceased print and suffered an erosion of subscriptions as a result. Our approach has been to let the subscribers decide. Perhaps difficult economic times will force this issue.

Monday, December 8, 2008

John Willinsky – The Open Access Advantage for Research: It’s more than the Price

My notes from John Willinsky’s talk at the APSR Open Access Publishing Workshop (they are a bit rough):

What is important = Mobilisation of knowledge

Background on PKP – a humble project with humble origins
Business model = PKP gets grants for research projects, and in the course of that research they develop software
John started this in 1998 out of frustration as an educator – his primary goal was being able to excite kids about knowledge
Once he became a professor – goal was exciting teachers about knowledge
Frustrating as an educator to not be able to share knowledge with students except when they were at the university, on campus, talking and accessing the material that the university pays for. But once they leave the university they have nothing – library cards are taken away – essentially told their thirst for knowledge must end once they leave the university
So why are we teaching people to be interested in learning and knowledge if we don’t make the knowledge available to them?

The consequences of the serials crisis were about access to knowledge in a fundamental way
Internet was exciting and filled people with hope because suddenly access was possible

Principle of the 1990s – asking, how are we going to make this work? On what basis can we circulate this?
People didn’t wait for an answer
They just started putting things up
And discovered that they got traffic; there were readers; people were interested

PKP started in 1998 – but realized they weren’t going to be able to convince people to put their journals online and make them free
Because then people were asking, what will it cost?
And we are still trying to answer that

Now, John proposes that this is a misleading question
He is not asking us to reconcile how some journals are publishing entirely with a zero budget while other journals have huge expenses
Different journals work on different economies, habits etc
All PKP wanted to do was contribute to this
Another piece to a complex puzzle
A piece that said, YOU can decide how much this will cost – will you use open software or not, will you have volunteers or paid work…etc?
There are some journals that have always been run on nothing more than the enthusiasm of academics and they deserve the proper software for this
That is why there is such a DIY focus to the software
And why PKP uses open source software

Business model for open source software = people don’t pay for software but they contribute to the software and build a community of users
This sounds a lot like scholarship
The software and the publishing model could come together

Saw value of work not in its profit or loss, but the value in its circulation

PKP continues to be funded from research grants, principally from Canadian Government

Want to provide a choice for the academic community – academics deciding how they want to shape the products they are creating


1. Research e.g. – Open Medicine

A series of editors for the Canadian Medical Association (CMA) were fired because the Association was not happy with the direction they were taking with the journal ($$)
Board resigned because they were upset the editors were fired
PKP offered the free software as an opportunity
Not fair that CMA and pharmaceutical companies could interfere with the research being disseminated
So these formerly well paid editors decided to form an open journal called Open Medicine
Agreed they would not accept medical advertising, and they would make their content available immediately for free
Has been difficult
Now on the brink of being indexed
Established for themselves the possibility of running a journal on a different economy
Interesting academic freedom questions – not just about the money
About keeping universities and academics at the forefront of what makes research daring
We need to see the libraries of our universities as public institutions

2. Eg – research on registered massage therapists

It was discovered that registered massage therapists were heavily engaged with the research in PubMed, and frustrated that they could only see 15% of the research that was available. They were not prepared to tell their clients about the research unless they could see the methodologies – interesting that this group would raise their standard of care in accordance with the open access of material

3. E.g. – Wikipedia

Finished a study on Wikipedia as a public point of entry
Fact that so many people are coming together to discuss and debate knowledge is itself a public good
Asked how much research is being included in Wikipedia
Compared with Stanford Encyclopedia of Philosophy – peer reviewed, free, in text links, etc
Found that 80% of entries in Stanford E of P are being cited in Wikipedia [me: that’s pretty cool]
Wikipedia may not be accurate about some things, but it is accurate about Aristotle, Descartes, the meaning of life – to the extent that it is citing peer reviewed literature
In a 2 week period, ¾ of the 80% cited were viewed and discussed
180 000 references to peer reviewed literature in W (but most of it not freely available)

Where are PKP going next?

Open access = public access

(1) Monograph is one of the most exciting developments moving forward
What we have done with journals has only harmed monographs
“shrinking monograph budget”
The thoroughness of an argument in a monograph is an important intellectual property that we cannot allow to disappear
If we discourage people from thinking in book-length thoughts, then the quality of research will decline

Want to create a digital option with print on demand for monograph
Want to use the same principles of OJS and OCS
Monograph system for monographs that would not otherwise get published – e.g. studies show that if you are working in Latin American history at an American university, then you will not get your monograph published

Want to bring in some ideas:
1. Bring back the great ideal of a wonderful editor – using the social network
2. Want to build an incubator where authors can start to put their work and where an editor could say – “I think this looks promising – let’s see if you have a book here”
3. Encourage people to think about whether they have a book or not and start developing it at an early stage
4. Make sure people are aware of what is already out there
5. Mainly about conceptualization – the monograph is something that needs an alternative approach. The universities have set the parameters of what they are willing to publish as a monograph and we think that is unacceptable – need to foster more extensive work
6. Want to create a publication area that lasts for longer – build a place for the book to be a permanent part of a growing culture – all comments received in the incubator become part of the book’s development

(2) how can we create better quality layout in a way that costs less
product = Lemon8-XML
like a great copy editor
e.g. can compare bibliography with bibliographies in PubMed and correct mistakes
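A toy sketch of the kind of check such a tool performs (this is my own illustration, not Lemon8-XML's actual implementation): compare an author-typed reference against a canonical record, such as one fetched from PubMed, and flag likely mistakes.

```python
import difflib

def check_reference(typed: str, canonical: str, threshold: float = 0.8):
    """Flag a typed citation that is close to, but not identical with,
    a canonical record. Illustration only - not Lemon8-XML's algorithm."""
    ratio = difflib.SequenceMatcher(None, typed.lower(), canonical.lower()).ratio()
    if ratio >= threshold and typed != canonical:
        return f"likely typo (similarity {ratio:.2f}); suggest: {canonical}"
    return "ok" if typed == canonical else "no close match"

typed = "Rowlands I, Nicholas D. The changing scholarly landscape. 2006."
canonical = "Rowlands I, Nicholas D. The changing scholarly communication landscape. 2006."
print(check_reference(typed, canonical))
```

The same fuzzy-matching idea generalises to author names, journal titles and page ranges, each compared field by field against the canonical record.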

(3) Working with Dataverse Network (out of Harvard with Gary King) to make data citable and will give the data a “thumbprint” so that if people download the data and use it and don’t credit you, then you can find them
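As I understand it, the "thumbprint" is essentially a cryptographic fingerprint of the dataset itself. A minimal sketch of the idea (using a plain SHA-256 hash over a canonical serialisation as an illustration, not the actual fingerprint algorithm the Dataverse Network uses):

```python
import hashlib

def data_fingerprint(rows):
    """Compute a short, citable fingerprint of a dataset.

    Illustration only: a plain SHA-256 over a canonical serialisation,
    not the actual algorithm used by the Dataverse Network.
    """
    canonical = "\n".join(",".join(map(str, row)) for row in rows)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

dataset = [(1, "control", 3.2), (2, "treatment", 4.7)]
fp = data_fingerprint(dataset)
print(fp)  # anyone re-downloading the same data can recompute and match this
```

Because any change to the data changes the fingerprint, a reused-but-uncredited copy of the dataset can be identified by recomputing its fingerprint and matching it against the published one.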

APSR Open Access Publishing: A PKP User Group Workshop

On Thursday 4 December 2008, I attended the Australian Partnership for Sustainable Repositories (APSR) Workshop entitled, Open Access Publishing: A PKP User Group Workshop.

PKP is the acronym used for the Public Knowledge Project, a research and development initiative directed toward improving the scholarly and public quality of academic research through the development of innovative online publishing and knowledge-sharing environments (see “About the Public Knowledge Project”). PKP was founded in 1998 and is located at the University of British Columbia and Simon Fraser University in Canada and Stanford University in California. PKP has developed Open Journal Systems (OJS) and Open Conference Systems (OCS), open source software for the management, publishing and indexing of journals and conferences.

Professor John Willinsky, Director of PKP, Professor of Education at Stanford University School of Education and author of, “The Access Principle: The Case for Open Access to Research and Scholarship” came out to Australia for the workshop, as did PKP Developer, MJ Suhonos. My notes from Professor Willinsky’s plenary address appear in this post.

The workshop was held at the University of Sydney and continued on Friday 5 December. I was unable to attend on Friday, but my colleague, Professor Anne Fitzgerald of QUT Law School, gave a presentation entitled, “Constructing open access by effective copyright management” and QUT’s DVC, Professor Tom Cochrane, spoke on “The Institutional Perspective on Open Access – dos and don’ts”. The full program can be viewed on APSR’s website.

My notes from Thursday follow.

The workshop was primarily focused on users’ experiences with PKP software. So we heard from Eve Young, Helen Morgan and James Williams from the University of Melbourne, Bobby Graham from the National Library of Australia and Susan Lever, Editor of the Journal of the Association for the Study of Australian Literature about their experiences with using OJS and from Peter Jeffery of the Australian Association for Research in Education (AARE) about using OCS. Generally the feedback was very positive (especially for OJS) but some suggestions for improved usability (particularly for non-tech savvy academics) were also made. Susan Lever spoke about the exciting opportunity that online publishing offers where articles can contain in-text live links to other sites offering additional information, images and videos, which greatly enrich the experience of the reader.

The university ePress was also a topic of the day. Lorena Kanellopoulos informed us about the management and operation of Australian National University (ANU) ePress and Dr Alex Byrne spoke about University of Technology Sydney (UTS) ePress. UTS ePress publishes the journal, Portal, which I believe was the first journal to be published in Australia using PKP software. The main point to come out of Lorena and Alex’s presentations, to me, was that university ePress costs were not high and that universities can publish their own journals, using open source software and a “publish online with a print-on-demand option” approach, successfully and cost-effectively. Dr Geoffrey Borny, Visiting Fellow in the School of Humanities, College of the Arts and Social Sciences and Member of the Emeritus Faculty at the Australian National University, gave a personal account of what it was like to publish a book with ANU ePress. He was a very happy customer, saying that ANU ePress was efficient and professional, and that publishing online had given him much wider exposure than he expected.

For me, however, the most interesting presentation of the day (aside from Professor John Willinsky’s plenary address, which is covered in a separate post) was from Andrew Stammer, Journals Publishing Director at CSIRO Publishing. As Andrew pointed out, the CSIRO Publishing Charter creates an interesting creative tension between CSIRO Publishing’s commercial role and public interest role by stating that CSIRO Publishing is to:
  1. Operate within CSIRO on a commercial basis with its viability entirely dependent on the capacity to generate revenue and sufficient return on investment (i.e. CSIRO Publishing must fund itself – it apparently receives no funding from CSIRO or the Australian Government); and
  2. Carry a national interest publishing obligation on behalf of CSIRO within this commercial role.
Despite not agreeing with everything that Andrew had to say (I was highly amused to see that he included “lobbying” amongst the publishers’ roles, right up there with “striving for quality in content” and “nurturing relationships”), I thought that his presentation was remarkably well balanced. He spoke about the OA initiatives of CSIRO Publishing, including the publishing of an OA journal – The South Pacific Journal of Natural Science. He explained the publishing process, being that publishers:
  • Acquire content;
  • Review and develop content (facilitate peer review);
  • Prepare content for dissemination;
  • Disseminate content; and
  • Promote content and authors.

Andrew also spoke at length about the costs associated with publishing. And these costs seemed quite incredible to me. For journal publishing of 1162 pages, across 108 articles in 12 issues, printing alone cost $43,166. This cost is quite distinct from the costs associated with layout, peer review, promotion or even postage (postage additionally cost thousands of dollars). Many of these costs, I think, could be avoided or massively reduced by online dissemination and print-on-demand services.
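To put those printing figures in some perspective, a quick back-of-the-envelope calculation (my own, using only the numbers Andrew quoted) gives the unit costs:

```python
# Unit costs derived from the figures quoted above:
# 1162 pages across 108 articles in 12 issues; $43,166 for printing alone
# (excluding layout, peer review, promotion and postage).
pages, articles, issues = 1162, 108, 12
printing_cost = 43_166

print(f"per page:    ${printing_cost / pages:.2f}")     # ~$37 a page
print(f"per article: ${printing_cost / articles:.2f}")  # ~$400 an article
print(f"per issue:   ${printing_cost / issues:.2f}")    # ~$3,600 an issue
```

At roughly $400 per article just for printing, the appeal of online-first dissemination with print on demand is easy to see.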

Yet what really jumped out at me was a graph that Andrew displayed, which he had acquired from the journal article: Rowlands I and Nicholas D (2006) The changing scholarly landscape, Learned Publishing, 19, 31-55. He showed this under the heading, “What do authors want?” and I was only able to quickly scribble down the order in which the items appeared:
  1. Reputation of journal
  2. Readership
  3. Impact factor
  4. Speed of publication
  5. Reputation of editorial board
  6. Online ms submission
  7. Print & electronic versions
  8. Permission to post post-print
  9. Permission to post preprints
  10. Retention of copyright.

Being a lawyer and an advocate that authors retain copyright in their works and only issue their publisher a Licence to Publish, I was rather concerned about “retention of copyright” being last on a list of “what authors want”.

On Friday morning, I looked up the journal article online, its full citation being: Rowlands, I., Nicholas, D. (2006). The changing scholarly communication landscape: an international survey of senior researchers. Learned Publishing 19(1), 31-55. ISSN: 0953-1513.

The article presents the results of a survey “on the behaviour, attitudes, and perceptions of 5,513 senior journal authors on a range of issues relating to a scholarly communication system that is in the painful early stages of a digital revolution” (p31). The survey was conducted by CIBER, “an independent publishing think-tank based at University College London” (p31), in early 2005 and was commissioned by the Publishers Association (PA) and the International Association of Scientific, Technical and Medical Publishers (STM) with additional support from CIBER associates. I was somewhat skeptical about the survey being commissioned by two publishing bodies, but the article’s authors assure readers that:
The views expressed in the Report and in this article are those of the authors alone, based on the data. They do not represent a corporate position, either of the PA or STM. The survey was conducted in a totally unbiased fashion; the research team (CIBER) has no allegiances other than to the data (p33).
The graph in the article is labeled “Figure 7 Reasons for choosing last journal: averages, where 5 = very important, 1 = not at all important (n = 5,513)” not “what authors want”. The actual figures in the graph were –
  • Reputation of the journal – 4.50
  • Readership – 4.21
  • Impact factor – 4.04
  • Speed of publication – 3.89
  • Reputation of editorial board – 3.55
  • Online manuscript submission – 3.43
  • Print and electronic versions – 3.21
  • Permission to post post-print – 2.58
  • Permission to post pre-print – 2.34
  • Permission to retain copyright – 2.31

In my opinion, the reasons why an author may have chosen to publish in a particular journal in the past are not necessarily indicative of what may influence them to publish where in future, especially in this very changeable environment of academic publishing. Yet it is still somewhat concerning to see permissions to post pre- and post-print versions of the article and to retain copyright rate so low.

The question must be asked why the survey results may have shown these preferences. I think it is important to point out that this survey was undertaken in 2005, so does not reflect the most current state of affairs. Additionally, the authors identify the age of the survey respondents as being a potential influencing factor:
More than a third (35.9%) of the respondents are baby boomers, aged 45 or older, and many of their attitudes will have been formed during a long period of relative stability for the academic sector, at a time when the current difficulties facing institutional library budgets and the scholarly communication market were not so evident (p37).

The authors also write:
Many spoke of the influence of external measures, like impact factors, in determining where they feel they have to publish, sometimes to the detriment of their readers (p41).

However, “readership” and “speed of publication” rated almost as highly as “reputation of journal” and “impact factor” – features which I would argue could be delivered quite effectively by OA journals, even relatively new ones.

My final point in relation to this article is that I perceived an implicit bias against OA publishing, despite the authors’ claims to the contrary. This I perceived from the phrasing of questions with a negative slant (for example, “How disruptive is open access?”) and from comments such as this:
There is a significant relationship between previous experience of publishing in an open access environment and researcher’s attitudes to the value they attach to peer review. Authors who have published in an open access journal are more likely to attach lower value to the importance of peer review (p44).

To me, this statement implies that OA journals do not necessarily use peer review or value peer review, which is simply not true.

Notwithstanding my opinions about how the results of the survey are presented, the article is an interesting read. The OAK Law Project has also conducted its own survey, in 2007, on the attitudes and practices of Australian academic authors in relation to the publication and dissemination of their research. The survey report can be accessed here (or by direct link to PDF).

Friday, November 21, 2008

Seminar: Towards a National Information Strategy

“Australia is behind many other advanced countries in establishing institutional frameworks to maximise the flow of government generated information and content” – Venturous Australia: Building Strength in Innovation.
On 19 November 2008, I participated in a free public seminar about the Review of the National Innovation System: Towards a National Information Strategy. The half-day seminar was held in the Hyatt Hotel in Canberra and was hosted by the Department of Innovation, Industry, Science and Research and the QUT Law School.

The speakers at the seminar included Professor Brian Fitzgerald and Professor Anne Fitzgerald, both IP professors in the QUT Law School, and Dr Nicholas Gruen of Lateral Economics. You can view the seminar agenda and speaker bios here.

Professor Brian Fitzgerald spoke about innovation as a force that results from the exchange of ideas. He said that collaboration was a key methodology for innovation. Professor Fitzgerald referred to statements made earlier this month by Finance Minister Lindsay Tanner when he said, “The rise of internet-enabled peer production as a social force necessitates a rethink about how policy and politics is done in Australia”. (Reported in the IT section of The Australian). Professor Fitzgerald spoke about how we need to move from a “gated” model of information distribution and knowledge creation to an access based model. He said, “By sharing IP we can harness a powerful new force – mass collaboration”. He also noted Barack Obama’s technology policy, which promotes openness of the internet and openness in government and research.

Dr Nicholas Gruen gave a compelling talk, very similar to his talk given at the CRC-SI Conference this year (see my earlier post). I like the way he defined innovation as “fragility in the face of serial veto” or “fragility amongst robust hazards”. He also gave his own interpretation of the current financial crisis – “The world has created the perfect storm designed to show us the importance of managing information.” One of Dr Gruen’s examples (there were many) of how small amounts of data or information could be used to vastly improve the lives of Australian citizens was what he called the “windows on workplaces” scheme. The idea is this: increasingly, it is becoming important to Australians to have a work/life balance. There are many workplaces that claim to offer a work/life balance, but in reality many do not. And currently there is no way for people to find out the true state of affairs until they actually start working for the company in question – and usually end up working long hours and missing social/family engagements. Wouldn’t it be easy, Dr Gruen says, to ask people to answer a few simple questions – this could be done when ABS is collecting census data – about whether or not their workplace actually delivers on their work/life balance promises? Then workplaces could be ranked according to what they actually provide – not just what they claim to provide – which would create proper accountability and incentives for workplaces to deliver on their promises. The scheme is simple and cheap, but if successful it could have an enormous impact on the lives of working Australians.

Professor Anne Fitzgerald spoke about policy developments in Australia and around the world on access to and reuse of government data and information. These policy developments are charted in a literature review that Professor Anne Fitzgerald is currently undertaking, entitled, Policies and Principles on Access To and Reuse of Public Sector Information: a review of the literature in Australia and selected jurisdictions. (See my earlier post on this).

I gave a brief overview of the research we have conducted in the area in the QUT Law Faculty. I also spoke about Professor Anne Fitzgerald’s literature review, and our new website about access to and use of public sector information (see my earlier post). My powerpoint presentation can be accessed here.

Overall, it was a very successful and informative seminar.

It was also great to hold the seminar in Canberra. Not only did it enable us to engage with many federal politicians, but we also had the afternoon to look around this lovely city. I visited the National Gallery of Australia, the High Court of Australia and old Parliament House, and had a grand old time before my flight back to Brisbane.

New: literature review and website on access to public sector information

Professor Anne Fitzgerald of the QUT Law Faculty is currently undertaking the massive task of reviewing the literature around policies and principles on access to and reuse of public sector information in Australia and worldwide.

The literature review is divided into chapters according to jurisdiction. This is an ongoing project and Professor Fitzgerald will be releasing the literature review in installments as each chapter is completed.

She has just released Chapter 1: Australia and Chapter 2: New Zealand. Currently, these chapters appear together in PDF form, but I believe they will appear separately later. The literature review so far is extremely comprehensive – chapters 1 and 2 alone comprise 268 pages!

Forthcoming are the remaining chapters – Chapter 3: International; Chapter 4: Europe, UK and Ireland; Chapter 5: United States and Canada; and Chapter 6: Asia.

Currently, the literature review is available in the QUT ePrints Repository (here), but it will soon appear on the new website: http://www.aupsi.org.

http://www.aupsi.org is the website of a new research group with which I am involved – Access to and Use of Public Sector Information (auPSI). auPSI’s mission is to provide a comprehensive web portal that:
  • promotes debate and discussion about the re-use of PSI in Australia and more broadly throughout the world;
  • focuses on developing and implementing an open content licensing model to promote access to and re-use of government information;
  • develops information policy products about delivering access to and encouraging the re-use of PSI;
  • keeps users informed about international developments in this area; and
  • assists governments and policy makers on the development of appropriate policy about the creation, collection, development and dissemination of public sector information.
This mission is built on achieving the following three objectives:
  1. greater efficiency in the reuse of PSI throughout the world;
  2. better quality of outcomes; and
  3. greater impact of publicly funded knowledge within our society.

The literature review will be released in full on this website, as will a forthcoming article by Neale Hooper, Timothy Beale, Professor Anne Fitzgerald and Professor Brian Fitzgerald entitled, “The use of Creative Commons licensing to enable open access to public sector information and publicly funded research results – an overview of recent Australian developments”. Keep your eyes peeled.

Thursday, November 6, 2008

CRC-SI Annual Conference 2008

This morning I attended the CRC for Spatial Information (CRC-SI) 2008 Annual Conference. The morning plenary was entitled, "Innovation in Australia" and was chaired by Peter Woodgate, CEO, CRC-SI. The session was opened (via video link) by Senator The Hon Kim Carr, Minister for Innovation, Industry, Science and Research, who noted the increasingly important role of spatial information and who expressed a desire to "restore public good as funding criteria" when the Australian Government is funding research and development.

In this session, I found Dr Nicholas Gruen's talk on Innovation in Australia especially interesting. My notes from his talk are below. They are a little rough - my apologies.

Dr Nicholas Gruen: Innovation in Australia

Information in the economy

What is the economy?

We used to think of the economy as “a thing which makes things”. But we now understand that there is more to economic policy than that. The economy is a “giant trading machine” – trade is important in our (new) concept of the economy. In economic policy reform over the last 30 years – including competition policy – trade is the basic theme.

But the economy is more than THAT.

It is also a “giant risk management machine” and a “giant information management machine”.

We have a mixed/hybrid economy – an ecology of public and private goods = markets are always this, they are not just private goods.

Firms compete according to standards, which are a public good (language, mores, property rights and other laws, technical and trading standards); then firms compete in the private goods that fall within the gaps of the public goods.

“It is silly to talk of the internet as a private thing; it is not.”

Information is special – we need markets to harness distributed information and provide incentives.
Friedrich Hayek – one of the more important features of a capitalist economy is its capacity to deal with distributed information

But markets don’t handle information ideally either – Arrow, Akerlof, Stiglitz – information is a potential public good (reproduction is often costless) – the best way for information to circulate, in principle, is for nothing (in cost) – standards are crucial to the passage of information (in ways that are much more integral than markets for trading goods) – and standards themselves are a public good

Top down innovation in Government

We’ve been relatively good at it – e.g. secret ballot; HECS etc
We (here I think Nick is referring to the Innovation Review Panel as “we”?) recommended that we should further extend such innovative platforms – for instance HECS

Bottom up innovation in Government

This is the hard part
We looked at mechanisms to maximize the contribution of all levels of public sector innovation and also from the outside
Bottom up Innovation in the states (Vic) – e.g. Policy Idol – emerged from strategy workshop in the Premier’s Department – policy competition for junior officers – has been a very successful program

Then there’s government facilitating innovation elsewhere – the UK is pioneering various “challenge based” means of seeking to foster innovation

How to promote services innovations? -

The inadequacy of the tax concession

R&D tax concession works badly for services – to make it work you need to broaden the definition of R&D, then what happens is that firms in practice work out how to make their perfectly regular business activities fit within the new definition = not fair

Services innovation is often heavily regulated – finance, health, education – e.g. Rismark International

Permission to innovate?

regulation makes innovation difficult

We need innovation facilitation – we have major projects facilitation – we proposed something similar – Advocate for Government Innovation:
  • operate an Enterprise Challenge program
  • be a shopfront for “permission to innovate” processes
  • be bureaucratic champion for highly innovative firms and projects
  • help disseminate information about public sector information
  • provide resources to promote more flexible tendering

Innovation is often hard, but freeing up information is harder – e.g. Joshua Gans project to locate public toilets on iPhone – asked Department for permission to use information (which is available online) to make available on iPhone – Department said no because of contractual obligations; copyright issues etc.

The problem of serial veto – information has many hurdles to jump:
  • IP;
  • Contract;
  • permission hurdles;
  • “we see IP as a property law rather than some form of economic policy (like we now see competition policy)”;
  • compatibility of formats and systems;
  • lawyers' professional culture of risk aversion and control maximisation
Fragility in the face of serial veto – e.g. Patents v Open Source; open information v the culture of the public service, the legal profession and the media

Fragility amongst robust hazards – like trying to coordinate systems within houses: security alarm, lighting, sound, ventilation and air conditioning – we are still not very good at this; it still seems like a "futuristic" concept

Thursday, October 30, 2008

New website: R18+ for Games

Electronic Frontiers Australia (EFA) has a new website: R18+ for Games. This is part of EFA's new campaign to support the introduction of a new classification for video and computer games in Australia. Movies can be classified R. They can contain R-rated content and still be sold, borrowed and watched legally. So why not games?

From the website:

Australia is the only Western country without an R rating for computer and video games. If a game is deemed unsuitable for MA15+ by the Office of Film and Literature Classification, it is refused classification and cannot be sold. Titles including 50 Cent: Bulletproof, Postal 2, Leisure Suit Larry, NARC, Singles, Blitz: The League, and Manhunt have all been refused classification in recent years. In 2008 alone, four game titles have been banned: Silent Hill, Fallout 3, Dark Sector and Shellshock 2.

According to recent surveys, the average age of gamers in Australia is around 30 years old.

An R18+ classification would require the unanimous support of all Attorneys-General, and in the past moves to change the current classification have been blocked on the vote of a single state Attorney-General.

EFA is now sponsoring a campaign to have the R18+ classification for games introduced in Australia.

If you support this cause, I encourage you to visit the site. EFA is asking supporters to write to their state Attorney-General and request an R18+ classification for games. Every letter helps!

I should also note that EFA has been a prominent voice against the government's current "clean feed" proposal. Yesterday, EFA Chair, Dale Clapperton, appeared on Channel 7's Morning Show to discuss the issue. You can read more and view the Morning Show clip on Dale's blog.

[Disclosure: My partner is on the EFA board]

Friday, October 17, 2008

Ice TV

Yesterday, the High Court began hearing the Ice TV appeal. This is a case that could potentially have fairly wide ramifications for copyright protection of data compilations, or none, depending on whether the High Court rules in line with precedent (Desktop Marketing).

For links to commentary around the case so far, see Peter Black's post here.

Ben Atkinson and Professor Brian Fitzgerald of the QUT Law Faculty yesterday posted a paper on QUT eprints, entitled, "Copyright as an Instrument of Information Flow and Dissemination: the case of ICE TV Pty Ltd v Nine Network Australia Pty Ltd". You can read the paper here.

Wednesday, October 15, 2008

Short Review: The Importance of Being Earnest

Last night I saw The Importance of Being Earnest at QPAC. It was fabulous and hilarious - the audience (including myself) was laughing throughout. The cast did a fantastic job of hamming it up in true Oscar Wilde style. The set design and costumes were also amazing. Highly recommended. Catch it while it's still in Brisbane.

Open Access Day

Today, 15 October 2008, is Open Access Day!

Today I attended an OA Day event in the QUT Library, which was sponsored by SPARC, PLoS and Students of Free Culture.

First, we watched the “Voices of Open Access” video, which is available on the Open Access Day website, and the QUT Library Secretariat “Shout Out” for OA video.

We then had some presentations and discussions, moderated by Elizabeth Stark.

Peter Jerram, CEO of PLoS, gave a short introduction. He stated that there are now:
  • Over 3600 journals in Directory of Open Access Journals (DOAJ)
  • Now more than 12,000 OA repositories in more than 70 countries
  • More than 50 mandates for OA in 28 countries
He also gave his thanks to:
  • Authors who choose to publish in OA
  • Peter Suber
  • Melissa Hagemann, OSI
  • DOAJ
  • Publishers and editors of OA journals
  • Research funders such as the Wellcome Trust that provide funds for OA journals
  • SPARC and Students of Free Culture
  • Advocates of OA

Dr Phil Bourne, Editor in Chief of PLoS Computational Biology, who was presenting from University of California San Diego, gave the keynote presentation. The webcast can be accessed at http://openaccessday.org/program/

Presentation: The Promise of Open Access

mash up of academic content
e.g. Pubcast – video integrated with the full text of the paper – but this requires openness in relation to the paper i.e. unrestricted access, Creative Commons licence
e.g. Professional Profile includes all sorts of content: publications, pubcasts and videos etc – profiles are a first step to virtual research environments

BioLit: Tools for new modes of scientific dissemination
Mash up between database and journal article
Integrates biological literature and biological databases, and includes:
  • A database of journal text
  • Authoring tools to facilitate database storage of journal text
  • Tools to make static figures and table interactive
Semantic enrichment of text
Semantic enrichment at the point of authoring – like the spell checker in Word – scans for specific information/words (e.g. the name of a gene) and retrieves related information, which appears in a column to the side of the paper; the author can choose whether to link to that information or not.
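The enrichment-at-authoring idea described above can be sketched as a simple lookup pass over a draft. This is a toy illustration of the concept, not BioLit's actual implementation; the gene names and annotation text are invented.

```python
# Toy sketch of authoring-time semantic enrichment: scan a draft for
# known terms (here, gene names) and collect suggested annotations that
# an author could accept or reject. Purely illustrative.

# Hypothetical lookup table standing in for an external database query.
GENE_ANNOTATIONS = {
    "BRCA1": "breast cancer type 1 susceptibility protein",
    "TP53": "tumour protein p53",
}

def enrich(text):
    """Return (term, annotation) suggestions for known terms found in the text."""
    suggestions = []
    for word in text.replace(",", " ").split():
        if word in GENE_ANNOTATIONS:
            suggestions.append((word, GENE_ANNOTATIONS[word]))
    return suggestions

draft = "Mutations in BRCA1 and TP53 are discussed below."
print(enrich(draft))
```

In a real tool the lookup would query a live database and the suggestions would surface in the editor's side panel, as described above.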


Q: How does peer review fit into the new multi-media environment?
A: It is a misconception that peer review does not fit into the OA environment
For Pubcast, the paper associated with the video has already been peer reviewed.

Q: is there a plug-in for the semantic enrichment tool for open office or other platforms that are not Word?
A: Not yet, but probably coming. Will be open source and people can do what they like with it. No restraints imposed by Microsoft

ARROW Repository Day

On 14 October 2008, I attended the ARROW Repository Day held in Customs House in Brisbane. I presented on the legal issues surrounding management of data for inclusion in a repository. You can access my slides here.

Chris Rusbridge of the Digital Curation Centre in the UK also presented. Some brief notes from his talk are below. Chris was live blogging the day, so if you are interested I suggest you read his notes at the Digital Curation Blog.

Chris Rusbridge (Digital Curation Centre) – Moving the repository upstream

The resistant scholar
  • Uncertainty, risk - about copyright; about Ingelfinger Rule
  • Change
  • Too busy
  • Doesn’t fit into the way they do things now
  • Not well motivated by advantages to others
  • Little in it for them!

Research workflow
  • many different tasks in parallel
  • all different stages
  • teaching (several), research (several), writing up research, writing grant proposals, reviewing papers, administrative tasks etc

On negative clicks

Asked - how many extra clicks are you willing to make to ensure preservation of your record?

Answer - zero

Negative click repository?

Can the repository help rather than hinder?
Towards a Research Repository System? [diagram]

Maybe we could…
  • help with publisher liaison
  • support multiple authoring across several institutions
  • more permissive identity management
  • support multiple versions
  • fine grained access control
  • checkpointing
  • support supplementary data
  • provide basic data management capability
  • provide simple, cross-platform, persistent storage
  • provide some longevity
  • provide additional benefits

Thursday, October 9, 2008

More on the Brisbane Declaration

This is what Professor Arthur Sale of the University of Tasmania, one of the chief architects of the Brisbane Declaration, has written about it:

...May I tease out a few strands of the Brisbane Declaration for
readers of the list, as a person who was at the OAR Conference in Brisbane.

1. The Declaration was adopted on the voices at the Conference,
revised in line with comments, and then participants were asked to put
their names to it post-conference. It represents an overwhelming
consensus of the active members of the repository community in Australia.

2. The Conference wanted a succinct statement that could be used to
explain to senior university administrators, ministers, and the public
as to what Australia should do about making its research accessible.
It is not a policy, as it does not mention any of the exceptions and
legalisms that are inevitably needed in a formal policy.

3. The Conference wanted to support the two Australian Ministers with
responsibility for Innovation, Science and Health in their moves to
make open access mandatory for all Australian-funded research.

4. Note in passing that the Declaration is not restricted to
peer-reviewed articles, but looks forward to sharing of research data
and knowledge (in the humanities and arts).

5. At the same time, it was widely recognized that publishers' pdfs
("Versions of Record") were not the preferred version of an article to
hold in a repository, primarily because a pdf is a print-based concept
which loses a lot of convenience and information for harvesting, but
also in recognition of the formatting work of journal editors (which
should never change the essence of an article). The Declaration
explicitly makes it clear that it is the final draft ("Accepted
Manuscript") which is preferred. The "Version of Record" remains the
citable object.

6. The Declaration also endorses author self-archiving of the final
draft at the time of acceptance, implying the ID/OA policy (Immediate
Deposit, OA when possible).

While the Brisbane Declaration is aimed squarely at Australian
research, I believe that it offers a model for other countries. It
does not talk in pieties, but in terms of action. It is capable of
implementation in one year throughout Australia. Point 1 is written so
as to include citizens from anywhere in the world, in the hope of
reciprocity. The only important thing missing is a timescale, and
that's because we believe Australia stands at a cusp.

What are the chances of a matching declaration in other countries?

Arthur Sale
University of Tasmania

This is what Peter Suber had to say on his blog:

This is not the first call for OA to publicly-funded research. But I particularly like the way it links that call to (1) OA repositories at universities, (2) national research monitoring programs, like the HERDC, and (3) the value of early deposits. Kudos to all involved.

Wednesday, October 8, 2008

Just announced: Brisbane Declaration [on open access in Australia]

Following the conference on Open Access and Research held in September in Australia, and hosted by Queensland University of Technology, the following statement was developed and has the endorsement of over sixty participants.

Brisbane Declaration

The participants recognise Open Access as a strategic enabling activity, on which research and inquiry will rely at international, national, university, group and individual levels.

Therefore the participants resolve the following as a summary of the basic strategies that Australia must adopt:
  1. Every citizen should have free open access to publicly funded research, data and knowledge.
  2. Every Australian university should have access to a digital repository to store its research outputs for this purpose.
  3. As a minimum, this repository should contain all materials reported in the Higher Education Research Data Collection (HERDC).
  4. The deposit of materials should take place as soon as possible, and in the case of published research articles should be of the author’s final draft at the time of acceptance so as to maximize open access to the material.

Brisbane, September, 2008

Tuesday, October 7, 2008

My presentations - September 2008 conferences

You can access my presentation at the Open Access and Research conference (Friday's workshop on legal issues) here, and my presentation at the eResearch Australasia conference (Friday's workshop on eResearch in the Arts, Humanities and Cultural Heritage) here. (Both are in PDF.)

Friday, October 3, 2008

ANDS Workshop at eResearch Australasia Conference

On Thursday 2 October, I attended the Australian National Data Service (ANDS) Workshop at the eResearch Australasia Conference 2008. This was a full-day workshop, but the ANDS team did a great job of keeping it interesting and highly interactive, and the day went very quickly.

In the morning, there were a few brief presentations – notably from Andrew Treloar of Monash University and the ANDS Establishment Project, and Tracey Hind from CSIRO. I particularly enjoyed Tracey's presentation, which, at a conference that seemed dominated by IT issues, focused on the social and governance issues involved in managing and sharing research data. My notes from Tracey's talk are below.

The rest of the day was spent in small round-table discussions. The most lively discussion surrounded questions about what institutions and research bodies need to help them in managing and sharing their data, and how ANDS could help. The group found that there was a need for:
  • an openly accessible registry of ontologies for metadata of datasets, so that institutions can start using common and enduring metadata to describe their data;
  • training for researchers, repository managers, research management staff, librarians, archivists and IT staff about data management (including the legal issues surrounding data management), database/repository infrastructure (how to make the database easy to use and sustainable), open access (why should you share your data?) and metadata. It was agreed that the training materials might have a generic introduction component that could be used by all groups, but then there should be different kinds of training materials that provide relevant detail to different groups (e.g. research management staff will have different concerns to IT staff; science researchers may have different concerns from humanities researchers);
  • developing conventions for the citation of data, so that researchers can get credit for sharing their data; and
  • proper and comprehensive data management plans (DMP).

There was a consensus that data management plans were particularly important and that it would be useful to develop template DMPs which included specific sections that could be added or deleted as appropriate (for example, a section about compliance with privacy laws might be relevant to medical research but not to astronomy research). It was also thought that ANDS could select a few research projects from different disciplines and assist these projects in formulating a DMP. The resulting DMPs could then be made available online for other projects to use and adapt.
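The template-DMP idea discussed above can be sketched as a plan assembled from core sections plus discipline-specific optional ones. The section names below are invented for illustration; they are not an ANDS template.

```python
# Illustrative sketch of a template data management plan (DMP) built
# from core sections plus optional sections that a project adds or
# drops as appropriate. Section names are invented examples.

CORE_SECTIONS = ["Data description", "Metadata standards", "Storage and backup"]
OPTIONAL_SECTIONS = {
    "privacy": "Compliance with privacy laws",
    "indigenous": "Indigenous knowledge considerations",
}

def build_dmp(*, include=()):
    """Return the ordered section list for a project's DMP."""
    return CORE_SECTIONS + [OPTIONAL_SECTIONS[key] for key in include]

# A medical project might include the privacy section; an astronomy
# project could simply omit it.
print(build_dmp(include=["privacy"]))
```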

In relation to ANDS selecting particular projects to assist, in a broader way, with their data management and release (“engagement targets”) in the hope that these projects might then appear as “exemplar projects” for other groups, it was considered that appropriate selection criteria might be:
  • broadness of audience and impact;
  • potential for reuse of data and the ongoing reusability/sustainability of the data;
  • the project’s willingness to assist others to develop their data management skills;
  • wide inter-disciplinary appeal;
  • willingness to transfer data around; and
  • projects which will have good exemplary value to attract other communities.

I believe that ANDS will make the notes taken from the workshop available online.

Here are my notes from Tracey’s talk:

Tracey Hind – CSIRO
  • ownership of data should stay with researcher
  • but still need to manage CSIRO’s data at a higher level – maybe provide an “enabling” service for this rather than dictate a “one size fits all” approach
  • As of now, CSIRO still does not formally recognise the idea of data management
  • Real challenges are not technology – it is the human factors – issues of acceptance, understanding, people being prepared to share their data, IP etc
  • High demand for storage, but storage is not management
  • Scientists are not working as well across disciplines as the Flagship vision had hoped; much of this is because "you don't know what you don't know" – and it's hard getting insight into other research disciplines
  • Making data easily discoverable is the key to achieving multi-disciplinary outcomes
  • Lesson is that data is a complex issue – especially when researchers don't understand the potential benefits – you need exemplar projects to demonstrate the benefits of data management to get buy-in.
  • CSIRO’s data management vision (eSIM) – CSIRO scientists will be able to…gather, analyse and share scientific information securely and efficiently, leading to greater scientific outcomes for Australia
  • Four layers – people, processes, technology and governance
  • People challenges = incentives for deposit into a repository;
  • Processes challenges = making sure that the work flows created actually support the technology and make things easy
  • Governance = making sure all of this is properly funded and that data management is a part of the decision making (i.e. make sure researchers have a DMP before they are awarded funding)
  • CSIRO’s exemplar projects = Auscope project; Atlas of Living Australia; Corporate Communications

Tuesday, September 30, 2008

eResearch Australasia Conference 2008 - Tuesday morning (30 September)

John Wilbanks – Uncommon Knowledge and e-Research

Once again, John Wilbanks gave an informative and dynamic presentation. It was geared towards the audience in attendance here at the eResearch Australasia Conference (who are somewhat more IT and science focused than the audience at the OAR conference last week) and so described in detail many aspects of the NeuroCommons Project. If you are interested, I suggest that you see the Neurocommons website. I don’t think any summary that I could provide here would do the project justice. But here are some notes from the beginning of John’s presentation:

Why “eResearch”?

1. eResearch is a requirement imposed on us by the flood of data
  • the web doesn’t give us the same results for science as it does for culture
  • so what can we do?
  • We can…collaborate
  • Eg - Watson and Crick – their success was built on a series of blocks of knowledge that were available to them from a range of sources
  • But humans can’t build models to scale anymore
  • We need to utilize digital resources
One way to think about eResearch is that it is about:
  • Finding the right collaborator;
  • making big discoveries;
  • getting credit for one’s work
2. We need to convert what we know into digital formats that support model building
  • "the web" – no organising topics – hyperlinking allows us to organise things in a dynamic way
  • all the data and all the ideas: building blocks
  • open access attempts to solve the legal problems – giving credit where credit is due; allowing humans to read the papers; allowing publicly funded research to be accessed by the public
  • but it doesn't solve the technical problem of paper-based formats that cannot be read by machines
  • we need to develop machine-searchable formats
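A machine-searchable format of the kind described here usually means statements expressed as subject–predicate–object triples, the model behind RDF (which projects like NeuroCommons use). A minimal sketch, with invented terms:

```python
# Minimal sketch of representing scientific claims as subject-predicate-
# object triples, the idea behind machine-searchable formats such as RDF.
# The subjects, predicates and objects below are invented for illustration.

triples = [
    ("pyramidal_neuron", "located_in", "hippocampus"),
    ("gene_X", "expressed_in", "pyramidal_neuron"),  # hypothetical gene
]

def query(triples, predicate):
    """Return all (subject, object) pairs linked by the given predicate."""
    return [(s, o) for s, p, o in triples if p == predicate]

print(query(triples, "located_in"))
```

The point of the conversion is exactly this: once claims are triples rather than prose, a machine can answer questions like "what is located in the hippocampus?" without reading the paper.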

Kerstin Lehnert, Columbia University – New Science Communities for Cyberinfrastructure: The Example of Geochemistry

Kerstin described eResearch as a vision to provide a genuine infrastructure of highly reliable, widely accessible ICT capabilities to assist researchers in their work – ultimately about people

She discussed the cultural issues involved in sharing data. She identified data citation (what I would call "attribution") as a big problem. How can all scientists and contributors be cited? Many want to be attributed personally (not just by a project), but there are so many contributors that this quickly becomes a big and messy problem. This observation reflects the problem that we at the OAK Law and Legal Framework for e-Research projects identified in assessing whether Creative Commons licences could be applied to data compilations. Attribution is an important condition of the CC licence. Researchers and research projects need to decide and identify (before applying a CC licence) how the data compilation is to be attributed, otherwise users could run into all sorts of problems and confusion.

Jane Hunter (UQ) - National Committee for Data in Science (NCDS)

A committee of the Australian Academy of Science – established in February 2008; member of CODATA

Mission – to promote enduring access to Australia’s scientific data assets in order to drive national research and innovation
And to provide a National Data Science voice
Encourage and facilitate cross-fertilisation between specific science disciplines and other data generation/management disciplines

Future activities include engaging with Chairs of other national committees, including looking at what role they can play within ANDS (Australian National Data Service) to support their goals.

Review: Anatomy Titus Fall of Rome

On Thursday 25 September, I saw The Bell Shakespeare Company’s production, “Anatomy Titus Fall of Rome” at the Cremorne Theatre. The play was directed by Michael Gow and starred John Bell as Titus Andronicus.

I was very impressed with this production. It was contemporary (all actors performed in regular clothes and sometimes wore rather absurd masks) and powerful. I wasn’t quite sure how they were going to depict what is probably Shakespeare’s bloodiest tragedy, and in the end they did it with a lot of blood – a bucket of “blood” centre-stage, to be exact, which the actors flung all over the stage during the course of the production.

The actors did a wonderful job and carried the audience through the entire 2.5 hours without pause and without a hitch. The intermingling of comedy throughout the tragedy certainly helped.

The parts I liked best were where modern objects and references were woven amongst the Shakespearian ones – books (I think all were actually copies of Shakespeare's works) were used as weapons and the actors' monologues frequently featured random modern words thrown in as if to keep the audience on their toes.

However my favourite part was after the play itself, when the actors took some time to talk directly with the audience. This was a wonderful thing for them to do and it resulted in some very interesting discussion. Importantly, we discussed why a play that featured a prominent black character and the violent raping and torturing of a young woman was performed entirely by a white male cast. Several female members of the audience expressed the feeling that they would not have been able to watch the rape scene had it been performed with a female actor, and were consequently glad that a man had played the part. I actually thought the absence of both a dark-skinned actor and a female actor only served to vividly (and almost shockingly) reveal to the audience the racist and sexist undertones in Titus Andronicus, and indeed, in much of the world still today. I was impressed with the way the cast discussed these issues with the audience – they proved to be intelligent and sensitive to the issues. (However, it did not change the fact that the actors could only ever act out their interpretation, as a white male, of what it was like to be a woman or a black man.)

I would highly recommend seeing this production before it closes on 4 October.

Monday, September 29, 2008

eResearch Australasia Conference 2008 - Cloud Computing

Monday – Plenary: Cloud Infrastructure Services Panel Session

Chair: Nick Tate, UQ
Tony Hey – Microsoft Research
Peter Elford – Cisco
Kevin Mayo – Sun Microsystems
Anne Fitzgerald – QUT

Tony – A Digital Data Deluge in Research

- outsourcing of IT infrastructure
- minimize costs
- small businesses have access to large scale resources
- eg – Virtual Research Environment run by British Library: content management; knowledge management; social networking; online collaboration tools
[similar presentation to the one at the OAR conference]

Peter –

- is cloud computing really a new idea?
- don’t think so – still just software as a service
- so what is the “cloud”?
- do researchers struggle to get access to machines? – probably no
- but do they have problems managing them well – probably yes
- balance between technology, people and processes
- it is a natural evolution and another opportunity
- but not a disruptive technology

Kevin –

From point of view of building these systems:
- need a successful business model
- need to consider privacy and security in a global world
- need to understand technical considerations
- there are a number of services out there at the moment because they have managed to deal with the business model problems….
- …but they may not have effectively dealt with the other issues
- e.g. how you get your data to and from the service
- in the future – we might see: automating the collection and analysis of census data; climate data etc – with barely any interference by people

Anne –
- when we think of cloud computing, many legal issues come to mind: privacy, data security etc
- so far, adapting the law to the digital environment has developed in a very ad hoc manner
- so maybe we would be better to approach it from principles – I propose the following:

1. establishing trust in the online environment
- cloud computing = applications that can be accessed anywhere by anyone
- so issues of data security, privacy, reliability of the data and the service
- not much on this (beyond some privacy restrictions) in Australia at the moment

2. equivalence of traditional and online transactions
- need a set of rules to apply to online activities that are equivalent to traditional activities
- at the moment, the attempt is to transpose current laws into the online environment = copyright, Electronic Transactions Act
- but when we look at cloud computing we see this principle is not being applied in a consistent way
- need for clarification of concepts of ownership of data stored on someone else’s equipment
- vast difference between the copyright licence given to Google for Google Docs vs the rights that would be given to someone in the real world who is storing and managing someone else's documents (i.e. they would be given virtually no rights) – why the immense difference just because the storage and management occurs online?

3. Participation of Government in regulating online activities
- would enactment of legislation help or hinder here?

4. We need openness in this environment
- open standards and maybe also open source
- affordability of cloud computing can help to overcome the digital divide
- expectation of users is that they can access the service where and when they like

Development of laws and policies in this environment has occurred primarily at an international level (e.g. OECD – Seoul Declaration), but there is still no international body charged with regulating online commerce


Q: Ashley Buckle – Monash: not convinced that this is a solution for him running a small research lab – this is the problem: convincing people that this is for them, especially when they don’t want to be guinea pigs for new projects that may not work

A: Tony – you can only be convinced by something that works for you. There will be a variety of academic cloud services. But the real test is that it is easy to use, can be acquired easily and cheaply, and it should work for you and if it doesn’t work then you shouldn’t use it.

Q: If Microsoft and Google etc operate cloud computing services outside of the USA, does the Patriot Act still apply to them?

A: Not an expert on the Patriot Act, but - we need to establish uniformity or conformity throughout the world, after discussion among countries, and not just have one country's law dominate; otherwise this could actually be a barrier to trade etc.

eResearch Australasia Conference 2008

I am currently in Melbourne for the week, attending the eResearch Australasia Conference 2008, hosted by the Australian Government Department of Innovation, Industry, Science and Research (DIISR) at the Sebel and Citigate Hotels, Albert Park. The conference runs from Monday 29 September to Wednesday 1 October, followed by two days of workshops on Thursday 2 and Friday 3 October. I will be here until Friday. I will try to blog my notes as I go (subject to internet availability) and will post my overall comments at the end.

Sunday, September 28, 2008

OAR conference notes - Andrew Treloar

Dr Andrew Treloar – ANDS Establishment Project

Blue print for ANDS = Towards the Australian Data Commons (TADC) – developed during 2007 by ANDS Technical Working Group

TADC: Why data? Why now? – increasing data-intensive research; almost all data is now born digital; "Consequently, increasing effort and therefore funding will necessarily be diverted to data and data management over time"

TADC: Role of data federations – with more data online, more can be done; increasing focus on cross-disciplinary science

Changing Data, Changing Research – e.g. Hubble data has to be released 6 months after creation

ANDS Goal = to deliver greater access, easier and more effective data use and reuse

ANDS Implementation assumptions:
  • ANDS doesn’t have enough money to fund storage, and so is predicated on institutionally supported solutions
  • Not all data shared by ANDS will be open
  • ANDS aims to leverage existing activity, and coordinate/fund new activity
  • ANDS will only start to build the Australian Data Commons
  • ANDS governance and management arrangements are sized for the current funding
Realising the goal – need to:
  • Seed the commons by connecting existing stores
  • Increase (human) capability across the sector in data management and integration

ANDS structure = four programs:
  1. Developing Frameworks (Monash) - about policies, national understandings of data management, and research intensive organisations = assisting OA by encouraging moves in favour of discipline-acceptable default data sharing practices
  2. Providing Utilities (ANU) – Services Roadmap, national discovery service, collection registry, persistent identifier minting and management = assisting OA by improving discoverability particularly across disciplines (ISO2146)
  3. Seeding the Commons (Monash) – recruit data into the research data commons = assisting OA by increasing the amount of content available, much of it (hopefully) OA
  4. Building Capabilities (ANU) – improving human capability for research data management and research access to data – esp. early career researchers teaching them good data management practices from the beginning = assisting OA by advocating to researchers for changed practices

OAR conference notes - government bodies

Jenine Borowik – Australian Bureau of Statistics (ABS)

What stimulates particular disciplines to adopt OA when others do not?

This question is particularly pertinent to the ABS, which has a mission of promoting informed decision making – but there is an increasing array of "national interests" – as a result, the ABS has realised that it cannot continue to be an island of research and information gathering and dissemination; it needs to work with other organisations. Hence its interest in encouraging a community of organisations to build a rich statistical picture of Australia.

In 2005, the ABS removed the barrier of price to access to information, so anyone who accessed the ABS website could freely download publications etc. The number of downloads has risen from 1 million per year to 5 million per year, and page views from 50 million to 150 million.

Creative Commons (CC) gives a solution to another barrier – the legal barrier. ABS is interested in using CC. Would like to use something that is successful and widely understood rather than something they have developed that is “just theirs”. Also interested in the way the licences are carried with the particular item of data, and the requirement for attribution. Legal aspects not the primary consideration for ABS, so if there is a mechanism that makes it easy to apply the right licences then that is a good thing.
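The point about licences being "carried with" a data item, together with the attribution requirement, can be sketched as a metadata record that travels with the dataset. The field names and dataset title below are invented for illustration, not an actual ABS schema.

```python
# Sketch of a dataset record that carries its licence and attribution
# with it, so a reuser can generate a correct credit line. Field names
# and the dataset title are invented examples, not an ABS schema.

record = {
    "title": "Example regional population estimates",  # hypothetical dataset
    "licence_url": "https://creativecommons.org/licenses/by/2.5/au/",
    "attribution": "Australian Bureau of Statistics",
}

def credit_line(record):
    """Build the attribution statement a reuser should display."""
    return f"Source: {record['attribution']}, licensed under {record['licence_url']}"

print(credit_line(record))
```

The design point is the one the ABS makes: if the licence and the attribution string are machine-readable parts of the record itself, applying "the right licence" and giving correct credit becomes mechanical rather than a legal research exercise.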

Jeffrey Kingwell - Geoscience Australia (GA)

GA is a national geographic information clearing house. Collects seismic info, operates national mapping agency etc.

Mission = collect geographic stuff to give to other people to do stuff with.

So why is it so difficult to get the stuff out there?

GA is finding that, due to a number of factors including IP law and government IP policy, it is important to align OA policy with IP policy. This is an issue where policies are developed in different departments (e.g. IP policy by the commercialisation unit, OA policy in another area). GA is trying to construct an IP policy that is consistent with its vision and core function.

Creative Commons Pilot Project 2007-08

  1. Have a simple statement of your objective in sharing
  2. Align IP policy with that
  3. Use simple tools (such as CC) to implement

Dr Alexander Cooke – Australian Research Council (ARC)

Broad principles for an Accessibility Framework:
  • Publicly funded research outputs and data should be managed in ways that maximise public benefit;
  • Institutions or individuals receiving public funding have a responsibility to make the results of that funding publicly available
  • ….
What opportunities are there?
The Accessibility Framework offers the ARC and NHMRC (National Health and Medical Research Council) the possibility of strengthening their funding rules to mandate rather than encourage deposit

OAR conference notes - Maarten Wilbers

Session Six: A Legal Framework Supporting Open Access

Maarten Wilbers – Deputy Legal Counsel, CERN

Large Hadron Collider (LHC) – switched on 10 September

SCOAP = Sponsoring Consortium for Open Access Publishing in particle physics

Fundamental research mandate in particle physics – in a good place to move to full OA publishing of their scientific data and publications – this might be the “tipping point” for scientists in other disciplines

CERN founded in early 50s – OA in high energy physics was “in the cards” from the beginning…because OA is so logical

If you walk around CERN you can see the enormous tools constructed from public funds to help scientists gain greater understanding of small particles – the case for OA can almost be made without a word being spoken

OA in publishing is the future

CERN’s 1954 Convention laid the foundation for a culture of openness in the dissemination of the organisation’s scientific work: CERN must perform fundamental research for non-military purposes and make the results of its work generally available

This requirement of openness has helped in the shaping of a string of sequential milestones:
  • Scientific collaboration across national (and political) boundaries;
  • Preprint culture and peer review;
  • World Wide Web;
  • Computing Grid and Open Source software;
  • And most recently: promotion of OA publishing.

The legal frameworks governing these activities are supportive rather than restrictive in nature and adapted to collaboration involving multiple participants. Legal issues mostly concern copyright and are generally uncontroversial.

OA is a logical application of the web.

SCOAP aims to convert high quality particle physics journals to OA

Scientific experiments at CERN reflect CERN’s requirement of openness

Collaboration usually laid down in MOU - IPR vested in creating party, wide licensing between all parties involved

Publication of CERN’s work: particle physics pioneered the pre-print culture in the 1950s, scientific manuscripts circulated between scientists for peer review before publication

Main milestone was the creation of the World Wide Web at CERN by Tim Berners Lee

1992 – CERN released the WWW software into the public domain – “CERN relinquishes all intellectual property rights to this code, both source and binary form and permission is granted for anyone to use, duplicate, modify and redistribute it”

Why OA (from CERN’s perspective)?
  • High quality journals, offering peer review, are the High Energy Physics (HEP) community’s “interface with officialdom”;
  • Depending on definition of HEP, between 5000 and 7000 HEP articles published each year, 80% in 6 leading journals by 4 publishers
  • Subscription prices make the current model unsustainable. Change is required
  • HEP is a global undertaking and OA solutions should reflect this.

CERN’s potential solutions for OA publishing:
  • Articles free to be read for all
  • Tender process will result in price of article; linked to quality
  • ….
Legal issues – keep things as simple as possible!

A strong example of OA publishing – the design of the LHC was published in an OA journal (Journal of Instrumentation..?) just recently

OAR conference notes - Tony Hey

Tony Hey – Cloud Computing

Rationale for Cloud computing
  • Outsourcing IT infrastructure
  • Minimize costs
  • Large cloud/utility computing providers can have relatively small ownership and operating costs due to the huge scale of deployment and automation
  • Small businesses have access to large scale resources

Example – Amazon Web Services
= Simple Storage Service (S3) – storage for the internet; simple web service interface
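S3's "simple web service interface" amounts to a few operations on buckets of keyed objects. A toy in-memory sketch of that model (illustrative only — the real service is accessed over HTTP with signed requests, or via client libraries):

```python
# Toy in-memory model of S3's storage interface: a bucket maps
# string keys to byte objects, with put/get/list operations.
# Illustrative only; not the real network API.

class ToyBucket:
    def __init__(self, name):
        self.name = name
        self._objects = {}

    def put(self, key, data):
        """Store an object under a key (overwrites any existing object)."""
        self._objects[key] = data

    def get(self, key):
        """Retrieve the object stored under a key."""
        return self._objects[key]

    def list_keys(self, prefix=""):
        """List keys, optionally restricted to a prefix (like a folder)."""
        return sorted(k for k in self._objects if k.startswith(prefix))

bucket = ToyBucket("conference-photos")
bucket.put("2008/oar/day1.jpg", b"...")
print(bucket.list_keys("2008/"))  # ['2008/oar/day1.jpg']
```

The appeal for a business like smugmug is that this key/value abstraction replaces owning and operating storage hardware entirely.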

Example – smugmug.com
= A profitable, debt-free company because it does not own any hardware; it uses Amazon hardware instead (on demand, in the cloud)

Examples from Microsoft:

Live Mesh
  • A PC in the cloud
  • Can synchronize PC in the cloud with your laptop, your mobile devices such as phones or music players etc
Office Live Workspace
  • Can upload documents for other people to work on
  • Other people can download and use those documents that you choose to share

The future = software plus services for science

Expect scientific research environments to follow similar trends to the commercial sector

Example - Trident Scientific Workflow Workbench

Toward a Smart Cyberinfrastructure

Collective intelligence

Example – Last.fm

A world where all data is linked…
…and stored/processed/analyzed in the cloud

OAR conference notes - Richard Jefferson

Richard Jefferson – Opening the innovation ecology
  • Public good is not an abstraction
Yochai Benkler Stack: Physical-Code-Content-Knowledge

We should ask the question: if we are successful in that everything is made OA – what then? We must make sure that the knowledge we generate will enable people to act on this knowledge and use it for benefit

The post-Yochai Benkler Stack = Physical-Code-Content-Knowledge; Capability to Act

We now have a system that is so opaque, and has such intrinsic “impermissibility” embedded in it, that it is not useful and the capability to act on it is constrained

CAMBIA – focused on innovation system reform

BiOS Initiative – launched early 2005 with an article in Nature, biology open source (biological innovation for open society);

Patent system – actually a system based on open disclosure
This is not about rhetoric – it is about the practical goal of efficiency

OS – open source; open science; open society (need inclusiveness)

Used the example of “golden rice”, once the “poster child” of biological engineering: rice developed for third-world areas where vitamin A deficiency in food was causing children to go blind. But the result used so many different patented products and processes that golden rice was ultimately unable to go ahead.

Patent Lens – develop harmonized structure and infrastructure for searching patents; embedded metadata about patents; web 2.0 quality decision support about patents;

Efficiency = minimise tainting of a product from incorporating other people’s IP (usually unknowingly) and maximise capacity for adoption – this can be attempted by improving people’s knowledge about what IP is incorporated and enhancing decision-makers’ ability to make good decisions for the public good

Persistent, pervasive, jurisdiction agnostic activity = platform for community collaboration and transparency

Proper parsing, visualization and decision-making

Initiative for Open Innovation – increasing the equity, efficiency and effectiveness of science-enabled innovation for public good

Defining open innovation:
Open = transparent
Open = inclusive

Web-based tools for scientists, funding agencies, the public sector and innovation enterprises to mine the patent world

Build the Patent Lens into Nature and PLoS Biology – to show, when readers are reading an article about a particular invention, whether the author has filed a patent on it

OAR conference notes - Alma Swan

Alma Swan – Open Access: The Next Five Years

Where we are now:
  • Focus = research articles
  • Latest estimates show level of OA for research article is still <20%
  • Expect even more attempts by (some) publishers at obstruction:
  • Arguments often fallacious – best way to deal with them is calmly and rationally
  • Arguments sometimes dishonest
  • It is always wrong to argue that publicly funded research carried out by public researchers should not be made publicly available because it would hurt a private/commercial player
  • Weapon: copyright
  • Wielded, now, against the interests of academics and the paying public
  • Reason for the panic: OA mandates

Open Access policies:
  • a lot of almost-there well-meaning policies
  • come in various flavours; not all taste good to everyone
  • NIH
  • But we are on an upward trend
  • Mandates work; voluntary policies do not
  • Because the outcome makes glorious sense for the research institutions and funders
  • Repositories are also management tools
  • And marketing tools for a university
  • Helps the university make the best use of the web

Repositories: state of play
  • growing at a rate of around 1 per day
  • Alma cannot believe that in 5 years’ time there will be any serious university that does not have, and actively use, a repository

Copyright:
  • It is a completely resolvable issue
  • Yet it is the major barrier to simple acceptance and practice of OA by researchers
  • Copyright futures – actually a tendency towards the legal strengthening of copyright in general
  • Research community practice will demonstrate the way copyright is applied to scholarly articles is out of date
  • Author agreements that retain copyright (licence to publish)
  • New ‘liberal’ practices with respect to publishing findings
  • Anyway, OA is completely compatible with copyright

New, ill-defined issue: research data
  • increasingly the primary output in some fields
  • data have yet to be properly recognised as research output
  • increasingly the subject of mandates, too

New research approaches…
  • …depend on OA
  • e-research (big research) – collaborative research – needs OA to make it work properly
  • but so does collaborative ‘small’ research
  • interdisciplinary research
  • web 2.0 outputs becoming a norm
  • early examples of institutional solutions – institutions have to start to help things happen – VIVO: Virtual Life Sciences at Cornell (a system that links up within the uni: the repository, the library, personal websites of academics etc);

Pragmatic Solutions:
  • joining articles, data and other related outputs in better ways
  • more (and more) work on standards
  • ‘surfacing’ web content – i.e. better way to show off OA content
  • new services built across repository networks
  • clearer vision of how to reach a repository-based scholarly communication system
  • new technologies need to show content in a form that researchers (and machines) can exploit (XML) – this needs semantic, exploitable technologies
  • there are already publishers who use a repository as a means of submitting the paper to the publisher for peer review
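The point above about exposing content in a form machines can exploit can be sketched with a small example: serialising a repository record as simple Dublin Core-style XML, the kind of structured form harvesters and semantic tools can consume (the record itself is hypothetical; element names follow the common `dc:` vocabulary):

```python
# Sketch: serialising a (hypothetical) repository record as simple
# Dublin Core-style XML - a machine-readable form that harvesters and
# semantic tools can exploit, unlike a print-oriented PDF.
import xml.etree.ElementTree as ET

DC_NS = "http://purl.org/dc/elements/1.1/"

def record_to_xml(record):
    ET.register_namespace("dc", DC_NS)
    root = ET.Element("metadata")
    for field in ("title", "creator", "date", "rights"):
        if field in record:
            # Qualified tag names carry the Dublin Core namespace.
            el = ET.SubElement(root, f"{{{DC_NS}}}{field}")
            el.text = record[field]
    return ET.tostring(root, encoding="unicode")

xml = record_to_xml({
    "title": "Open Access: The Next Five Years",
    "creator": "Swan, Alma",
    "date": "2008",
    "rights": "http://creativecommons.org/licenses/by/3.0/",
})
print(xml)  # <metadata xmlns:dc="..."><dc:title>...</dc:title>...</metadata>
```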

Wrong solutions: impact and assessment:
  • for too long we’ve used a proxy measure of impact (the journal impact factor), and for years it has been used to advance (or retard) careers
  • with an OA corpus, multiple metrics and indicators are possible
  • e.g. in the health sciences in the UK, there is a move to measure impact by where research leads in terms of new medicines, new treatments, NHS spending etc, not just the journal where the article is published

Mahatma Gandhi:
First they ignore you
Then they laugh at you
Then they fight you
Then you win!

Everything “open” started as a big joke. But things are changing….

It’s been too easy to dismiss the issue:
  • institutions have been notably disengaged
  • scholarly communication has been low on the agenda
  • yet it is central to the core mission of a university

Questions universities will be addressing:
  • Are we happy with current quality and impact measures?
  • What do we want?
  • What new reward systems can we build?
  • How can we use the internet better?

Commentators: Prof Tom Cochrane (QUT) and Derek Whitehead (Swinburne)

Prof Tom Cochrane


Mandates are only likely to succeed if they are clearly purposed in terms of scope. There must be clarity about what outputs the mandates will catch, where the outputs will go and for what purpose, and clarity at a policy level about whether making a rule (mandate) is in itself sufficient. At QUT it was thought not to be enough – the mandate had to be implemented cleverly, which is where the library came in, developing the repository properly.


We need to look at the system of rewards – until we do something about incentives for data curation, it won’t happen, or will happen accidentally and haphazardly


A large number of people are rendered more uncertain about copyright than about anything else. Copyright must be dealt with in this space – we need clarity about it as an enabler not an obstacle


One trend that is contradicting the nature of research is that semantic web tools are forcing questions about how collaboration is to be managed. There is a rush to develop tools where management happens at a machine level rather than a human level. But unless we solve some of the legal and regulatory issues thrown up by the use of these tools, we will keep being hindered in our OA efforts.

OAR conference notes - John Wilbanks

John Wilbanks (of Science Commons) – The Future of Knowledge

Knowledge is a set of building blocks – value is not that much until you start to put it together with other ideas and knowledge

Ideas and knowledge want to be connected

2 futures – we get to choose which we build – (1) one in which only the people who have money have access to the knowledge; (2) one in which there is an open network

(1) Knowledge brings revolutions

The past of knowledge = “Human-scale knowledge” – the scholarly canon (journals) – knowledge was human-organised and human-structured
How did this knowledge bring a revolution?

Moving to a world where knowledge acquisition is faster, smaller, cheaper and more robotic. Moving from a world where humans generate the scale of knowledge to a world where machines generate the scale

We have an implicit network that is already there for knowledge, but because we are generating it so quickly and at such a large scale, we are coming up against barriers we never encountered before: legal (copyright, DRM), technical (we still use paper-based formats online that cannot be searched by machines – i.e. PDF), business (publishers make money from closed access and we don’t yet know how they can make money or build business models around open access), and social (scientists still get rewarded for being closed)

Over-atomised knowledge – smaller and smaller questions – primary output is a paper – John argues that these are not the primary vehicles for knowledge in a digital world

Incremental advances via technology – no big risks are taken to achieve great advances anymore, because you don’t get rewarded for taking these risks; in fact you come up against huge legal barriers that prevent you using other research to take them

(2) We need to make systemic changes that connect knowledge

e.g. “the commons” – a number of different meanings: (1) land we hold in common e.g. public footpath; right to do research – rights of way across private property; (2) no copyright – things we all own

we are coming from a world where it was hard to be a creator and disseminate your work. We are not in that world anymore. There is now a disconnect between the copyright laws that Disney wants and the copyright laws that we as individual creators want. This is where the commons can make a systemic change.

Systemic change in the way we think about how we share knowledge – not just paper-based formats in digital form, which force us to use technologies that are immediately outdated. What kinds of technology can we use instead? A network of devices (layers: physical; code; content – there have been many developments of openness in these layers, but we have also seen an imposition of control in them (copyright)). Do we need new layers? Knowledge layers, graph layers etc. Info atomization rather forces our hand to do this. Knowledge access needs to support the questions being asked (e.g. when you type a query into Google, it tells you to read thousands of papers – this is not the ideal answer)

Copyright is incompatible with ideas connecting to each other.

(3) The disruptive force of connected knowledge

“guild” culture (as in historical sense of guilds, where the crown put limits on people not in the guild from weaving etc)

the way we do science actively discriminates against crowds and the wisdom of crowds

knowledge can be democratized: programming; creativity; buying and selling
it is easy, cheap and free

there are no office superstores for science; there are no internet marketplaces for science…but they are coming

destroying a guild culture of knowledge…what will come after it?

Creating a network culture for knowledge

• are we going to “watch” the knowledge like tv, or do something with it? – in the future of knowledge, we should do stuff with our knowledge rather than just consume it

Commentators: Dr Terry Cutler and Prof Mary O’Kane

Dr Cutler –

Proud of the focus on open access in the Innovation Review; however, first an apology and explanation – there is a difference between the web version and the print version. Both were supposed to be released under CC but were not (a copyright assertion for Dr Cutler appeared instead) – he is now attempting to have this rectified for the web version.

Key assertions from the report = about investment in people; global integration; flows of information and the freedoms to innovate

Australia’s 2% challenge – at best, we have a 2% share of global knowledge generation, and we don’t pay enough attention to the other 98% and how we access it – as a country we will always have an interest in an open network because we derive the most benefit from it

flows of information = communications. Communications theory and legal principles around communications were always based on connectivity. Open access is really just an extension of these principles.

Challenge – who really “owns” this problem of driving solutions (particularly at a government level)? – we need the government to address accessibility issues and articulate a national innovation policy – someone needs to take responsibility for this at the centre of government

Too much emphasis on “protectable” knowledge and not enough on the informal and social networks that underpin the generation of an innovative community – we need to open up access to that tacit knowledge and put social networks back into science and technology

Professor Mary O’Kane –

(1) is the future that John is talking about possible? How do we get to participatory science?

Can Australia lead this move into a participatory culture? We need to change the incentives for scientists. We need to change the social culture and drivers generally. So what are the drivers? Usually the intrinsic values are strongest (i.e. solving problems) not money. So how can we celebrate these intrinsic values? Across the university sector we need to reward people for open publishing.

(2) Issues that arise if you start to get the participatory culture going?

A problem that arises when you use networks that have been built automatically is that it is very hard to “probe the node” and know what is in the network. But does the human need to know, or can we leave this to the machine? Do we need to know the knowledge? And at what level?


[John: we need to lower the cost of failure to increase the rate of innovation (i.e. in the context of start-ups)]

(1) Richard Jefferson: the power of the guild is building value, trust and quality control and we shouldn’t erode that

John (response): we don’t need to get rid of the guild completely, but we need to build another layer where we can build on the knowledge of everyone – we can still have trademarks etc to control quality

Mary (response): I’ve always wondered why we don’t use the internet more for structured, controlled discussion about things – there is no reason why we couldn’t and that would also help control quality – by generating discussion

(2) Roger Clarke – the “tacit knowledge problem” seems to assume that the way the human mind works can be reduced to a computer-based system – that the mind has a generic model we can all grasp and the problem is just that we haven’t transferred it over to the computer yet. But everyone thinks differently.

John (response): I don’t think we can actually encode how the mind works, but we need to make information available. That is the importance of openness – you need to be able to read, criticize and comment on what I put up, and that is how we see the reflection of the many different minds at work. Getting it into the computer means we can start accessing that information and competing on it using our brains rather than competing on our access to computers.