KMi Seminars
Better Science Through Benchmarking: Lessons for Software Engineering
This event took place on Friday 04 June 2004 at 14:30

 
Susan Elliott Sim

Benchmarking has been used to compare the performance of a variety of technologies, including computer systems, information retrieval systems, and database management systems. In these and other research areas, benchmarking has caused the science to leap forward. Until now, research disciplines have enjoyed these benefits without a good understanding of how they were achieved. In this talk, I present a process model and a theory of benchmarking to account for these effects. These were developed by examining case histories of existing benchmarks and my own experience with community-wide tool evaluations in software reverse engineering. According to the theory, the tight relationship between a benchmark and the scientific paradigm of a discipline is responsible for the leap forward. A benchmark operationalizes a scientific paradigm; it takes an abstract concept and turns it into a guide for research. Application of this theory will be illustrated using an example from reverse engineering: the C++ Extractor Test Suite (CppETS), a benchmark for comparing fact extractors for the C++ programming language. This talk will conclude with a discussion of how insights from studying benchmarking can improve the science in software engineering and collaboration in scientific communities more broadly.
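As a rough illustration of the idea that a benchmark pairs shared tasks with an agreed scoring procedure, the Python sketch below shows a minimal harness that runs several fact extractors over common C++ test cases and reports precision and recall against reference facts. The data structures, scoring rule, and names are illustrative assumptions made for this page, not part of the CppETS distribution itself.

# Minimal sketch of a benchmark harness for C++ fact extractors.
# All names, paths, and the scoring rule are illustrative assumptions;
# this is not CppETS itself.
from dataclasses import dataclass
from typing import Callable, Set

# A "fact" is recorded here as a (relation, source, target) triple,
# e.g. ("calls", "main", "parse_args").
Fact = tuple[str, str, str]

@dataclass
class TestCase:
    name: str            # e.g. "templates/accuracy-01"
    source_dir: str      # directory holding the C++ program under analysis
    expected: Set[Fact]  # reference facts agreed by the community

def score(recovered: Set[Fact], expected: Set[Fact]) -> tuple[float, float]:
    """Return (precision, recall) of the recovered facts against the reference."""
    if not recovered:
        return 0.0, 0.0
    true_positives = len(recovered & expected)
    precision = true_positives / len(recovered)
    recall = true_positives / len(expected) if expected else 1.0
    return precision, recall

def run_benchmark(extractors: dict[str, Callable[[str], Set[Fact]]],
                  cases: list[TestCase]) -> None:
    """Run every extractor on every test case and print a comparison."""
    for case in cases:
        print(f"== {case.name} ==")
        for name, extract in extractors.items():
            facts = extract(case.source_dir)
            precision, recall = score(facts, case.expected)
            print(f"{name:20s}  precision={precision:.2f}  recall={recall:.2f}")

Because every extractor is run on the same test cases and scored the same way, results become directly comparable across tools, which is the sense in which a benchmark operationalizes the community's shared standard of evaluation.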

(No replay available due to a shortage of technical staff to record the event on the day)

 

Future Internet
With over a billion users, today's Internet is arguably the most successful human artifact ever created. Its physical infrastructure, software, and content now play an integral part in the lives of everyone on the planet, whether they interact with it directly or not. Now nearing its fifth decade, the Internet has shown remarkable resilience and flexibility in the face of ever-increasing numbers of users, growing data volumes, and changing usage patterns, but it faces growing challenges in meeting the needs of our knowledge society. Globally, many major initiatives are under way to address the need for more scientific research, investment in physical infrastructure, better education, and better utilisation of the Internet. Within Japan, the USA, and Europe, major new initiatives have begun in this area.

To succeed, the Future Internet will need to address a number of cross-cutting challenges, including:

  • Scalability in the face of peer-to-peer traffic, decentralisation, and increased openness

  • Trust as government, medical, financial, and personal data are increasingly entrusted to the cloud, and middleware increasingly relies on dynamic service selection (see the sketch after this list)

  • Interoperability of semantic data and metadata, and of services which will be dynamically orchestrated

  • Pervasive usability for users of mobile devices and for users with different languages, cultures, and physical abilities

  • Mobility for users who expect a seamless experience across spaces, devices, and velocities
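As a rough sketch of the dynamic service selection mentioned in the trust challenge above, the Python fragment below chooses a service endpoint at request time from advertised trust and latency metadata. The service names, metadata fields, and ranking rule are illustrative assumptions, not a description of any particular Future Internet middleware.

# Minimal sketch of dynamic service selection. Service names, metadata
# fields, and the ranking rule are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class ServiceOffer:
    name: str           # e.g. "payments-eu-1"
    endpoint: str       # URL the middleware would invoke
    trust_score: float  # 0.0 (untrusted) .. 1.0 (fully trusted), from a hypothetical trust service
    latency_ms: float   # recently observed round-trip latency

def select_service(offers: list[ServiceOffer],
                   min_trust: float = 0.8) -> ServiceOffer:
    """Pick the lowest-latency offer that meets the trust threshold."""
    trusted = [o for o in offers if o.trust_score >= min_trust]
    if not trusted:
        raise RuntimeError("no offer meets the trust threshold")
    return min(trusted, key=lambda o: o.latency_ms)

# The middleware would re-run this selection for each request, so the
# binding between client and service can change as metadata changes.
offers = [
    ServiceOffer("payments-eu-1", "https://eu1.example.org/pay", 0.95, 120.0),
    ServiceOffer("payments-us-2", "https://us2.example.org/pay", 0.90, 80.0),
    ServiceOffer("payments-dev", "https://dev.example.org/pay", 0.40, 10.0),
]
print(select_service(offers).name)  # -> "payments-us-2"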