Keynote and Tutorial Speakers
Simulators Used to Enhance the Safety and Efficiency of Nuclear Power Generation
Janos Sebestyen Janosy
MTA KFKI Atomic Energy Research Institute
Head of Simulator Development Department
P. O. Box 49, H-1525 Budapest Hungary
Cheap and reliable power sources are essential for further technical development. However, fossil energy is regarded as dangerous to our environment because of its unavoidable CO2 emissions. Technologies already exist that can capture CO2 instead of releasing it, but they are barely affordable for the developing nations - those with the greatest need for new energy resources to accomplish their industrialisation.
There was a hope that a clean and abundant new form of nuclear energy - fusion-based nuclear power plants - would solve our energy problems forever. Due to the enormous number of difficult and unsolved technical problems, it became clear that fusion power is not going to be available for at least several decades.
On the other hand, rising oil prices have made bio-energy production economically profitable. However, the fields producing sugar cane and corn for conversion into fuel for our cars are lost to food production. The population of the Earth is growing exponentially - there are at least three times as many of us as there were 60 years ago. The "green revolution" - the intensification of food production using fertilizers and pesticides - has kept pace with this growth so far, but it is probably slowing down now: fewer and fewer new soils are available for the intensification of food production. The emerging lifestyle of developing nations - eating more meat and fewer vegetables - makes the existing fragile balance of food production and consumption even worse.
For the time being, it looks like the existing fission-based nuclear technologies are able to fill the gap in energy production.
There are several risk-sensitive industries - air traffic, nuclear power generation, dangerous chemical processes, etc. The technical solutions used are usually well proven and widely known; the occasional problems usually originate from insufficiently careful design, insufficient risk assessment, unsatisfactory training and mismanagement.
With proper operation and waste management, nuclear fission power is one of the cleanest and cheapest energy sources: no gases are emitted during energy generation or the related preparatory processes. The upcoming nuclear-fusion-based power plants are even more promising: all contamination in these plants will decay practically to nil in less than 100 years - but these technologies are still some way off. Renewable energy sources can play an important but only supplementary role, at least for the foreseeable future.
Due to historical reasons, public approval of fission-based nuclear power is rather low in many countries. On the other hand, we still have to wait at least several decades for the appearance of significant fusion power, and to close this gap the construction of a new generation of NPPs (nuclear power plants) seems unavoidable. Large-scale construction of carbon-dioxide-emitting conventional power plants operating on fossil fuel seems to be the worst solution anyway.
Even ignoring the CO2-related problems, the growing energy needs of the developing Asian countries cannot be satisfied by the oil and gas available on the markets alone. Therefore several NPPs are already under construction and more contracts are to come. Meanwhile, all over the USA and Europe the operation of old NPPs is being prolonged for another 20-30 years. Slowly, the construction of new nuclear power plants is being considered even in Europe. The first such "Generation 3+" NPP is already under construction in Finland.
The biggest problem of the nuclear industry is that the standstill lasted too long - there were very few new contracts during the last decades, and now we lack experts and production capacity. Unfortunately, this situation cannot be improved in just a few years. That is why the lifetime prolongation of existing nuclear power plants has become such a hot topic all over the world.
In all these cases, simulation studies and the use of simulators are essential. It is a well-known fact, widely accepted by scientists and engineers, that direct evaluation of different technical designs above a certain complexity level is unthinkable.
Careful modelling, model integration, verification and validation are necessary to build the simulation tools and computer codes for design; real-time simulators are even better suited for the testing and validation of complex industrial processes.
The average lifetime of a large nuclear or other power plant exceeds that of its instrumentation and control (I&C) systems several times over. Computer-based I&C systems are prone to even faster obsolescence. Without extensive simulation, the replacement of such systems would cause long-lasting outages resulting in great financial losses.
On the other hand, fuel assemblies are very expensive parts of a nuclear reactor. Initially they were used in Hungary for only 3 years, now for 4 years, and soon they will stay in the core for 5 years - of course, the new types must have higher uranium enrichment. Each year only 1/3, then 1/4 and later 1/5 of them is replaced, so the changeover to a new fuel type is a lengthy process, with mixed cores in use. The authorities require that the staff be trained on each particular core before they operate it. For this reason, the plant's simulator must be upgraded to simulate the exact behavior of each core foreseen for the next 5 years.
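The arithmetic behind the mixed-core transition can be sketched as follows (an illustrative toy model, not taken from the paper): if 1/n of the assemblies is swapped at each annual refuelling, a switch to a new fuel type leaves the core mixed for n-1 years.

```python
# Illustrative sketch (not from the paper): with an n-year fuel cycle,
# 1/n of the assemblies is replaced at each annual refuelling, so a
# switch to a new fuel type produces mixed cores for n-1 years, and
# the simulator must model every intermediate core composition.
def new_type_fraction(cycle_years: int, refuellings_since_switch: int) -> float:
    """Fraction of new-type assemblies in the core after a number of refuellings."""
    return min(refuellings_since_switch, cycle_years) / cycle_years

# A 4-year cycle: the core becomes homogeneous only after 4 refuellings.
for year in range(5):
    print(year, new_type_fraction(4, year))
```

Each of the intermediate fractions (0.25, 0.5, 0.75) corresponds to a distinct mixed core that the operating staff must be trained on.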
In the paper I first give a survey of the present state of energy production and consumption in the world; after that I summarize the results and practice at the Paks NPP in Hungary: first the evaluation of safety studies, then the working-out of new state-of-the-art operational procedures, and later the reconstruction of the Reactor Safety System and other I&C systems, the replacement of which - thanks to the extensive testing and tuning performed on the full-scope replica simulator - was completed during the regular refuelling outages of the NPP units. Finally, we had to introduce full 3D simulation of the coupled core thermo-hydraulics and neutron-kinetics models in order to simulate new reactor cores with mixed fuel-element types. We had to do everything listed above in real time - a really demanding task.
Dr Janos Sebestyen JANOSY has been head of the Simulator Development Department in the Atomic Energy Research Institute of the Central Research Institute for Physics at the Hungarian Academy of Sciences (MTA KFKI AEKI) since 1994. He obtained his MSME in Nuclear Engineering from the Moscow Engineering Power Institute, USSR, in 1973 and his MSEE in Process Control Computers from Budapest Technical University, Hungary, in 1976. He was employed by the Secretariat of the Ministry of Heavy Industry on Nuclear Power Plant Construction from 1973 to 1975. He has been a Senior Researcher since 1977 and a Senior Adviser since 2004.
Dr Janosy has published over 50 scientific papers in international journals and conference proceedings.
His main scientific interests:
During the last decades he was the manager of the following projects:
Advances In Grid Computing: Tenfold Acceleration Of Computing Using The Internet/Grid
Frank Zhigang Wang
Centre for Grid Computing
Cambridge-Cranfield High Performance Computing Facility
United Kingdom, Email: email@example.com
Prof Wang and his group have developed a Grid Computing platform which universally supports network applications with speedups of 2-25. Its associated protocol is the first of its kind worldwide. Best of all, this platform requires no changes in the way users work with their applications, since it conforms to existing IT infrastructures. During his presentation he will give a demo of using this platform to accelerate applications ranging from IBM DB2, MySQL, Office, the Firefox web browser and Google Earth to Media Player. This work has won an ACM/IEEE Supercomputing Award.
Evolution of network storage architecture: Storage architecture for networks continues to evolve. To deploy storage resources on the network, one can choose to "split" different components of the complete application/data path. A NAS (Network-Attached Storage) [1] splits at the filesystem via the NFS protocol (server/client mode). A NAS client mounts a file server's remote disk into its local file tree at /mnt/nfs/ via the NFS protocol, so applications can access it as if it were local. An NBD (Network Block Device) [2] splits at the device driver, making a remote disk on a different machine act as though it were a local disk, appearing as /dev/nda via a pair of split device drivers. iSCSI [3] splits at the SCSI bus, allowing a machine to use an iSCSI initiator to connect to remote targets such as disks and tape drives on an IP network for block-level I/O; the target devices consequently appear as locally attached SCSI devices. As a variant or successor of NAS, a Grid-Oriented Storage (GOS) appliance splits at its specific GOS-FS protocol, introducing parallel streams and GSI (Grid Security Infrastructure) to address the larger network latencies of computational grids.
Table 1: GOS virtualizes and accelerates access to remote data for different distributed applications.
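The "split device driver" idea behind NBD can be illustrated with a toy example (a hypothetical sketch using an arbitrary localhost port, not the real NBD protocol): a server exports a byte array block by block over TCP, and a client object hides the network so the remote storage reads like a local disk.

```python
# Hypothetical sketch of the NBD "split device driver" idea (not the
# real NBD wire protocol): the server half exports a disk image block
# by block over TCP; the client half makes remote blocks readable as
# if they were local, like /dev/nda.
import socket
import struct
import threading
import time

BLOCK = 512  # block size in bytes (assumption for this sketch)

def serve(disk: bytearray, port: int) -> None:
    """Server half: answer each 4-byte block-number request with one block."""
    srv = socket.socket()
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    conn, _ = srv.accept()
    with conn:
        while True:
            req = conn.recv(4)
            if len(req) < 4:          # client closed the connection
                break
            (block_no,) = struct.unpack("!I", req)
            conn.sendall(disk[block_no * BLOCK:(block_no + 1) * BLOCK])
    srv.close()

class RemoteDisk:
    """Client half: read_block() hides the network behind a local-looking API."""
    def __init__(self, port: int):
        self.sock = socket.create_connection(("127.0.0.1", port))
    def read_block(self, block_no: int) -> bytes:
        self.sock.sendall(struct.pack("!I", block_no))
        buf = b""
        while len(buf) < BLOCK:       # reassemble exactly one block
            buf += self.sock.recv(BLOCK - len(buf))
        return buf

# Usage: export a 4-block "disk" and read block 2 remotely.
disk = bytearray(b"\x00" * BLOCK * 2 + b"\xab" * BLOCK + b"\x00" * BLOCK)
threading.Thread(target=serve, args=(disk, 5007), daemon=True).start()
time.sleep(0.2)                       # give the server time to bind
rd = RemoteDisk(5007)
print(rd.read_block(2) == b"\xab" * BLOCK)  # → True
```

NAS, iSCSI and GOS differ from this sketch only in where the split sits (filesystem, SCSI bus, or the GOS-FS protocol with parallel streams), not in the basic client/server idea.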
GOS Implementation: GOS uses Globus XIO to communicate with the parallel-stream accelerator engine during file read/write operations. The Globus XIO framework manages the I/O operation requests that an application makes via the user API. The parallel-stream driver is responsible for manipulating and transporting user data. The GSI driver performs the messaging necessary to authenticate the user and verify the integrity of the data. The TCP driver executes the socket-level transport code needed to establish a connection to the given contact string.
Key Advances: OpenOffice, MySQL, IBM DB2, the Firefox browser, Media Player and Google Earth have been deployed on top of GOS. These applications achieve up to tenfold acceleration (Table 1). GOS addresses the challenge of sharing files across wide-area networks with "LAN-like" performance, which is highly beneficial to Virtual Organizations (VOs). With the accelerated GOS-FS engine, which remains fully compatible with existing IT infrastructures, users can efficiently manipulate and move remote files and multimedia clips on computational grids.
Scientific Impact: We have proposed and implemented a new Grid-Oriented Storage (GOS) architecture [4], in which a disk drive or disk array with an electronic board connects directly to the Grid. GOS products fit the thin-server categorization, with a re-developed and simplified operating system, and can accelerate tenfold the access to data on the Internet/Grid. Best of all, this platform requires no changes in the way users work with their media/web/office/database applications, since it conforms to existing IT infrastructures (POSIX). The GOS-FS protocol is the first of its kind worldwide, and an RFC (Request for Comments) is now being drafted in an attempt to include it in the Internet protocol suite.
[1] G. A. Gibson and R. Van Meter, "Network Attached Storage Architecture," Communications of the ACM, 2000.
[2] en.wikipedia.org/wiki/NBD, 2007.
[3] en.wikipedia.org/wiki/Iscsi, 2007.
[4] F. Wang, S. Wu, N. Helian, A. Parker, Y. Guo, Y. Deng and V. Khare, "Grid-Oriented Storage: A Single-Image, Cross-Domain, High-Bandwidth Architecture," IEEE Transactions on Computers, Vol. 56, No. 4, 2007.
Prof. Frank Zhigang Wang
Chair in e-Science and Grid Computing
Director of Centre for Grid Computing
Cambridge-Cranfield High Performance Computing Facility (CCHPCF)
Cranfield University Campus
Professor of Computing Science and Information Systems
Director of Japan Pacific ICT Centre
Chair of the Interim Board of Pacific Computer Emergency Response Team (PacCERT)
Head of School of Computing, Information and Mathematical Sciences
University of the South Pacific
The author discusses the development of new Information and Communications Technologies (ICT) in the areas of e-Learning, e-Health, e-Government, e-Journalism, etc. The current Web 2.0, combined with 3D telepresence, the Future Internet and Semantic Web technologies, supports the creation of social networks and content retrieval and analysis, which lays the foundation for a future fully automated cyberspace.
Current dynamic Internet developments and the continuous demand for ubiquitous connectivity, combined with the next generation of networks, contribute towards the creation of future cyberspace infrastructures worldwide.
The implementation of cyberspace in government and corporate infrastructures contributes towards the creation of a new paradigm in decision-making processes. Decisions that are currently governed by human intelligence, knowledge and intuition may come to be influenced by cyber-data and cyber-processes.
The future cyberspace will ultimately impact decision-making processes in government, corporate, industrial and academic institutions worldwide. In conclusion, the author promotes discussion of the social and ethical impact of the future cyberspace and ICT technologies in the context of governance vs. privacy.
Keywords: Future Cyberspace, ICT Technologies, Governance, Security, Privacy, 3D Tele-presence, globalization.
The digital revolution has spurred the rapid expansion of economic activity across the face of the planet. In this paper the author discusses the unprecedented outburst of advances and innovation in the Internet and Information and Communications Technology (ICT) that drives the digital revolution today. He further discusses how ICT innovation works, its impact on learning technologies and methodologies, and what forms of communications technologies based on current ICT can be expected in the future. Since innovation does not happen in a vacuum, the author also discusses the current technological and social factors that can accelerate or impede changes in current ICT and the future cyberspace. The current trends in globalization create neither a level playing field nor a truly "flat world". Governments worldwide are focused on creating the best market opportunities while educating and industrializing as quickly as possible in the face of growing competition. Attempts to gain national competitive advantage promote the creation of artificial walls that may trigger potential conflicts and disagreements.
The technical development of the Internet and ICT technologies has established a platform for the next-generation Internet and ICT, often referred to as the next-generation cyberspace. The ever-increasing accessibility of connectivity by anyone, to anyone, at any time, from any place to any place ultimately creates a cyberspace facilitating the creation, manipulation and sharing of information globally by many, in real time. The current cyberspace is already changing the way we work, study, live and socialize. The future cyberspace will revolutionize the way we live, enabling automatic real-time visualization and audio connectivity worldwide.
The next-generation cyberspace, combined with future ICT technologies, will drive e-type applications such as e-health, e-government, e-security, e-law, e-learning and e-commerce to the next level of a fully automated cyberspace. The next-generation cyberspace will have a significant global impact on societies, economies, and political and legal structures.
Some people may ask: what will the 22nd-century cyberspace be? Will it be safe and human-friendly, or will it be insecure and potentially harmful to humans? The answers to these and similar questions will most likely depend on how the global community of scientists, researchers, technology developers, sociologists, educators, thinkers, engineers, lawyers, businessmen, politicians, etc., works together today and will work tomorrow. This may be a good time to start developing ICT and cyberspace-related technologies that will contribute towards the betterment of life for everyone, instead of mechanizing the relationships between peoples and nations, motivated by the economic and/or political benefits of a select few groups.
Professor Babulak is an international scholar, researcher, consultant, educator, professional engineer and polyglot with more than twenty-five years of teaching experience and of industrial experience as a professional engineer and consultant.
He was a Panel Speaker at the KIZUNA WINDS Symposium in Tokyo in February 2010; an Invited Speaker at Yokohama National University and the National University of Electro-Communications in Tokyo, Japan, in December 2009, at the University of Cambridge, UK, in March 2010 and in 2009, and at MIT, USA, in September 2005; and an Expert-Evaluator for the European Commission in Brussels in June 2007.
Professor Babulak is a Fellow of the Royal Society for the encouragement of Arts, Manufactures and Commerce (RSA), a Fellow of the British Computer Society (BCS), a Nominated Fellow of the IET, a Nominated Distinguished Member and Senior Member of the ACM, and a Mentor and Senior Member of the IEEE, and has served as Chair of the IEEE Vancouver Ethics, Professional and Conference Committee.
Professor Babulak works as a Full Professor and Head of the School of Computing Science and Information Systems and as Director of the Japan Pacific ICT Centre at the University of the South Pacific in Suva, Fiji.
He worked as a Full Professor and Head of the MIS Department in Cyprus, and held five Visiting Professorships in Canada (B.C. and Quebec), Spain and the Czech Republic (Prague and Pardubice). He worked as an Associate Professor in California, a Senior Lecturer in the UK, a Lecturer in Pennsylvania, Germany and Austria, a Lecturer and Teaching Assistant in Canada, and a College Instructor in Czechoslovakia. His academic and engineering work has been recognized internationally by the Engineering Council in the UK and the European Federation of Engineers, and accredited by the British Columbia and Ontario Societies of Professional Engineers in Canada.
Prof. Babulak serves as an Editor-in-Chief, Honorary Editor, Co-Editor and Guest Editor. His research interests are in Future Networks, Ubiquitous Computing and QoS, E-Commerce, E-Health, IT, MIS, Applied Informatics in Transportation, E-Manufacturing, Human-Centric Computing, E-Learning, Automation and Applied Mathematics.
Professor Babulak speaks 14 languages and is a member of the Institution of Engineering and Technology (MIET), the American Society for Engineering Education (MASEE), the American Mathematical Association (MAMA) and the Mathematical Society of America (MMSA). His biography has been cited in the Cambridge Blue Book, the Cambridge Index of Biographies and a number of issues of Who's Who.
Secure Multi-party Computation for Preserving Privacy:
Problems, Techniques and Applications
Durgesh Kumar Mishra
Professor (CSE) and Dean (R&D),
Acropolis Institute of Technology and Research, Indore, MP, India,
Ph - +91 9826047547, +91-731-4730038
Chairman IEEE Computer Society, Bombay Chapter,
Vice Chairman, IEEE MP-Subsection,
Abstract: Consider a set of parties who trust neither each other nor the channel by which they communicate. Even so, the parties wish to correctly compute some common function of their local inputs while keeping their local data secure from the others. This, in a nutshell, is the problem of secure multi-party computation (SMC). The problem is fundamental in cryptography and in the study of distributed computation. It takes many different forms depending on the underlying network, on the function to be computed, and on the amount of distrust the parties have in each other and in the network.
In this tutorial we present several aspects of secure multi-party computation: the privacy of individuals, the correctness of the result and the reduction of network traffic. We first define the problem in various settings. Our definition is drawn from previous ideas and formalizations, and incorporates aspects that were previously overlooked. We also present several problems associated with SMC. Next we address the problem of hiding the data from the trusted third party (TTP) that computes the result. We present the existing SMC solutions along with the protocols we have developed. In our first solution, we introduce a randomly selected anonymizer between the parties and the TTP to hide the data. In addition to the random selection of the anonymizer, each party divides its data into a number of packets and sends them to different anonymizers, so that the entire data never reaches a single anonymizer and the privacy of individuals is maintained. We then present another problem: enabling SMC to perform the correct computation of the result as well as the authentication of the computing body. We introduce multiple TTPs instead of a single one, since a single TTP's behavior can be suspect; with multiple TTPs we have the option to choose one from a pool of TTPs for the computation. In this method we divide the data into several packets that are sent to multiple TTPs, and a randomly selected master TTP performs the computation after accumulating the data from the other TTPs. For the authentication of TTPs we introduce the concept of equivalence classes, with whose help malicious TTPs are removed from the system before further computation. Finally, we present the problem of dealing with adversaries in SMC and minimizing their effects, investigating the power of adversaries in several situations. We have also minimized the network-traffic complexity of the entire SMC process.
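The packet-splitting idea can be made concrete with a minimal sketch (our own illustrative construction using additive secret sharing, not the authors' exact protocol): each party splits its private value into random shares that sum to it, each share travels through a different anonymizer, and the TTP only ever sees aggregated sums that reveal no individual input.

```python
# Hypothetical sketch of the packet-splitting idea (illustrative, not
# the tutorial's exact protocol): each party additively splits its
# secret into one share per anonymizer; every anonymizer aggregates
# the shares it receives, and the TTP sums the anonymizers' totals.
# Neither any single anonymizer nor the TTP sees a complete input.
import random

MOD = 2**61 - 1  # arithmetic modulo a large prime (assumption)

def split_into_shares(secret: int, n_shares: int) -> list[int]:
    """Split `secret` into n random shares that sum to it mod MOD."""
    shares = [random.randrange(MOD) for _ in range(n_shares - 1)]
    shares.append((secret - sum(shares)) % MOD)
    return shares

def secure_sum(private_inputs: list[int], n_anonymizers: int) -> int:
    # Each anonymizer collects exactly one share from every party,
    # so no anonymizer ever holds a party's complete value.
    buckets = [[] for _ in range(n_anonymizers)]
    for secret in private_inputs:
        for i, share in enumerate(split_into_shares(secret, n_anonymizers)):
            buckets[i].append(share)
    # Anonymizers forward only their partial sums to the TTP.
    partial_sums = [sum(b) % MOD for b in buckets]
    return sum(partial_sums) % MOD  # the TTP's final result

salaries = [3000, 4500, 5200, 2800]
print(secure_sum(salaries, n_anonymizers=3))  # → 15500
```

The multi-TTP variant described above extends the same idea: the partial sums would go to several TTPs, with a randomly chosen master TTP accumulating them, so a single misbehaving TTP cannot compromise the computation.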
Dr Durgesh Kumar Mishra received his M.Tech. degree in Computer Science from DAVV, Indore, in 1994 and his PhD degree in Computer Engineering in 2008. Presently he is working as Professor (CSE) and Dean (R&D) at the Acropolis Institute of Technology and Research, Indore, MP, India. He has around 20 years of teaching experience and over 5 years of research experience. He completed his research work on Secure Multi-Party Computation with Dr. M. Chandwani, Director, IET-DAVV, Indore, MP, India. He has published more than 60 papers in refereed international/national journals and conferences, including IEEE and ACM venues. He is a Senior Member of the IEEE and Chairman of the IEEE Computer Society, Bombay Chapter, India. Dr. Mishra has delivered tutorials at IEEE international conferences in India and abroad, and is a programme committee member of several international conferences. He has delivered invited talks on Secure Multi-Party Computation and information security in Taiwan, Bangladesh, Singapore, the USA, the UK and several places in India. He is the author of one book and a reviewer for three international journals of information security. He is Chief Editor of the Journal of Technology and Engineering Sciences. He has been a consultant to industries and government organizations such as the Sales Tax and Labour Departments of the Government of Madhya Pradesh, India.