Saturday, 24 November 2012

Modern medical ethics: Critical issue for 21st century | The News Tribe Blogs


“If we believe men have any personal rights at all as human beings, they have an absolute right to such a measure of good health as society, and society alone is able to give them” – Aristotle

Health systems are defined by the World Health Organization (WHO) as “all the activities whose primary purpose is to promote, restore or maintain health”. As we move into the 21st century, the promotion and protection of human rights is gaining greater momentum. The WHO constitution of 1946 states: “The enjoyment of the highest attainable standard of health is one of the fundamental rights of every human being without distinction of race, religion, political belief, economic or social condition”

Patients, families and healthcare professionals occasionally face complicated decisions about medical treatments. These decisions may clash with a patient's and/or family's morals, religious beliefs or healthcare plan. In such difficult situations, medical ethics is not only a considerate review of how to act in the best interest of patients and their families, but also a matter of making good choices based on beliefs and values regarding life, health, and suffering. In the past, only a few individual physicians devoted themselves to medical ethics. Beginning in the second half of the twentieth century, the field underwent explosive expansion and experts from numerous disciplines entered medical ethics. The swift advances in medical diagnosis and treatment and the introduction of new technologies have created numerous new ethical problems, resulting in the maturation of medical ethics as a specialty in its own right.
Enormous development has been achieved in the medical field during the last few decades, and more is projected in the decades to come. Advances in diagnostic imaging and biological testing techniques, as well as in medical forecasting based on genetic testing, are ongoing. Advances in surgical and medical cures, organ and tissue transplantation, artificial organs, cloning, tissue culture techniques, molecular biology and information technology are reported almost daily. “Modern Medical Ethics” is based on concepts derived from various disciplines, including the biomedical sciences, the behavioral sciences, philosophy, religion and law. Modern medical ethics is essentially a form of applied ethics, which seeks to clarify the ethical questions that characterize the practice of medicine and to justify and weigh the various practical options and considerations. Thus medical ethics is the application of general ethical principles to ethical issues. The application of such ethics is not specific to medicine; it also extends to economics, law, journalism, and the like.
Medical ethics is now not only part of the curriculum in institutes of the health professions in developed countries; research institutes of medical ethics have also been established at all levels. In developed nations the medical literature has proliferated, with numerous books and journals devoted entirely to the subject. In such countries the common citizen is also vitally interested in the subject, and public lectures, newspaper articles, legal discussions and legislation on medical ethical issues are frequent. Within Canada, the EU, the United States, and to some extent the Gulf countries, Modern Medical Ethics has emerged as a new profession. These professionals have normally specialized in one or more of the fields of philosophy, ethics, law, religion and medicine, and serve as advisors in hospitals to physicians, patients and their families. They also work to resolve difficult ethical questions posed to them by the medical team or by patients and their families.

Monday, 19 November 2012

3S tech – implementation for disaster management in developing countries!


Disasters, whether natural or man-made, wreak havoc on the lives of millions of people every year around the globe. Their aftermath is nothing but a harsh picture of destruction, death, and misery. Today it is a proven truth that natural disasters can happen at any place, irrespective of the developed, developing or least developed status of a country, and experience also shows that the least developed and developing countries are impacted more severely by large-scale natural disasters. It is not always possible to avoid disaster, but the suffering can be reduced by appropriate disaster management through proper disaster management tools. Advancements in Information and Communication Technology in the form of GIS, remote sensing, space technology, etc., can help a great deal in the planning and implementation of disaster reduction procedures.

Space-based ICT applications are playing an explicit role in providing information, information services and decision support tools for disaster management in developed regions of the globe. Procedures such as continuous information acquisition over a broad geographic area, as well as the distribution of information services and applications to remote and less serviced areas, benefit chiefly from this technology. The rapid development of space-based ICT and the integration of remote sensing, Geographic Information Systems (GIS) and satellite positioning systems, collectively known as 3S technology, have created a solid foundation for effective disaster information and monitoring management in the developed world. 3S technology is being extensively utilized in two broad areas of disaster management. The first deals with raising awareness, and includes the preparation and planning needed to decrease vulnerabilities through, among other things, an understanding of the underlying processes, modeling, monitoring, early warning systems, forecasting, and hazard risk mapping.
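
To make the 3S integration concrete, the minimal Python sketch below overlays a remote-sensing-derived flood-extent layer on a layer of GPS-surveyed villages to flag settlements at risk. It is only an illustrative sketch: the file names, the 'population' column and the use of the geopandas library are assumptions for illustration, not a reference to any real dataset or deployed system.

```python
# Minimal 3S-style hazard-mapping sketch (hypothetical file names and columns).
# Requires: pip install geopandas
import geopandas as gpd

# Remote-sensing-derived flood extent polygons and GPS-surveyed villages
# (assumed shapefiles; replace with real data sources).
hazard = gpd.read_file("flood_extent.shp")    # polygons from satellite imagery
villages = gpd.read_file("villages.shp")      # points with a 'population' column

# Ensure both layers share one coordinate reference system before analysis.
villages = villages.to_crs(hazard.crs)

# GIS step: a spatial join marks every village that falls inside a flood polygon.
at_risk = gpd.sjoin(villages, hazard, how="inner", predicate="within")

print(f"{len(at_risk)} villages at risk, "
      f"total exposed population: {at_risk['population'].sum()}")

# Save the result for early-warning and relief planning.
at_risk.to_file("villages_at_risk.shp")
```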

As stated by the United Nations Development Programme (UNDP), 24 out of 49 least developed countries, most of which are in Asia and the Pacific, face high levels of disaster risk; the technology enabled by satellites therefore holds great potential to meet significant needs in developing countries. In particular, satellite-based remote sensing, communication and navigation capabilities can provide valuable services in the developing world. Unfortunately, the potential of these technologies is not fully realized due to various obstacles that prevent developing countries from making use of such technology. These barriers include lack of access to funding, expertise, infrastructure, equipment and education. Research done with the International Telecommunication Union (ITU) has shown that an investment of $1 in ICTs used for disaster management through monitoring and response could save $14-$22 in rehabilitation after the disaster.

In recent years Pakistan has suffered a series of natural disasters, including the 2005 earthquake and the floods of 2010 and 2011. These calamities killed thousands of citizens and cost millions by destroying large-scale infrastructure, housing, livestock, agriculture and other assets. Since Pakistan is situated on major earthquake fault lines, the likelihood of similar tragedies in the future remains significant. Pakistan still lacks a systematic and organized approach towards managing the effects of natural disasters. However, the National Disaster Management Authority (NDMA), the executive arm of the National Disaster Management Commission (NDMC) headed by the Prime Minister, has been assigned the task of coordinating disaster risk management at the national level, implementing disaster risk management strategies, mapping hazards, developing guidelines and ensuring the establishment of Emergency Operation Centers (EOCs) at provincial, district and municipal levels. Despite the establishment of this organization with its apparently varied tasks, disasters in Pakistan are barely managed efficiently. According to the government, a lot of work is under way, but the last budget could not allocate significant money for the IT or telecommunication sector to build an efficient disaster management structure to overcome natural disasters. Keeping the current ICT infrastructure of the country in mind, especially in the rural areas, such digital rescue operations remain a myth, far from reality.

In the light of the above facts, it is obvious that Pakistan is in dire need of an organized disaster management programme to face emergency situations. The Government of Pakistan must reinforce its disaster management policies and build up institutions, not only to tackle such situations but also to make the best use of available resources. It is high time that the government as well as the citizens of Pakistan play their respective roles to bring about a positive change!


“A nation that continues year after year to spend more money on military defense than on programs of social uplift is approaching spiritual doom”. – Martin Luther King Jr.


ICT in education: Transforming the learning environment



“Those who know cannot be like the ones who do not know. Of course, knowledge and ignorance are like light and darkness which can never be alike” – Holy Quran

Education plays a principal role in human development. It promotes a productive and informed citizenry and creates opportunities for the socially and economically deprived segments of society. The development of a nation depends on its system of education, as it develops capacities in individuals and enhances their inner strengths – intellectual, political, social and economic – against domination, exclusion and discrimination. Educational development occupies an important place at the apex of the development pyramid and helps to develop the cream of society – a select group of individuals – physically, intellectually, emotionally and socially.

“There is no doubt that the future of our State (Pakistan) will and must greatly depend on the type of education we give to our children and the way in which we bring them up as future citizens of Pakistan” – Quaid-e-Azam Muhammad Ali Jinnah (All Pakistan Education Conference, Karachi)

Education is a dynamic process. Every human being is born with talents, and education draws out these talents in a healthy, integrated and balanced manner. According to the UNESCO Commission on Education, “educational institutions are a decisive factor in training men to contribute to the development of the society, to play an active part in life and in properly preparing men for work”. Therefore, the spread of education is a sine qua non both for modernization and the sustenance of democracy, and also for making man “be himself” and “become himself” (UNESCO, 1979).

Innovation in science and technology is transforming the world at an astonishing rate. Developments in computing and communication, in particular, are helping to accelerate these changes. As we move into the 21st century, we observe that ICT has changed many aspects of the way we live. If one compares fields like banking, business, engineering, medicine, tourism and law, the impact of ICT across the past two or three decades has been enormous. The way these fields function today is vastly different from the way they operated in the past. In recent years many theoretical and practical efforts have also been made to assess the impact of ICT on the educational reform process, in terms of both access to education and quality of education, because among all the development sectors, education is the sector most focused on improving the efficiency, accessibility and quality of the learning process.

People in present-day society are becoming more and more familiar with ICT, which refers to ‘the technology that enables communication and transmission of information’. When implementing ICT in the education sector, there are considerable challenges such as cost, internet access, training and policy issues, each of which has its own ways of being addressed that are valuable to apply around the world. However, all these challenges of development through applying ICT to the education sector must be considered against the environment that each country faces, because the situation of each nation differs from that of the others. ICT can transform the learning environment in the following ways:

·         Active learning: ICT can provide tools for the examination, calculation and analysis of information, and can thus offer a platform for student inquiry, analysis and construction of new information.

·         Collaborative learning: ICT can support learners through interaction and cooperation among students, teachers, and experts regardless of where they are. It can also provide the opportunity to work with people from different cultures.

·         Evaluative learning: ICT can permit learners to explore and discover rather than merely listen and remember, and it can recognize that there are many different learning pathways and many different articulations of knowledge.

The significance of education for human development needs no elaboration. For developing countries like Pakistan, ICTs have the potential to increase access to education and improve its relevance and quality. The Government of Pakistan accepts education as a fundamental right of its citizens, yet it has an unimpressive track record in the provision of literacy at the grass-roots level. Consequently, the reality of the Digital Divide – the gap between those who have access to technology and those who do not – means that the introduction and integration of ICTs at different levels and in various types of education will be a most challenging undertaking. Failure to meet the challenge would mean a further widening of the knowledge gap and a deepening of existing economic and social inequalities between the developed and developing world.
“If you treat an individual as if he were what he ought to be, and could be, he will become what he ought to be and could be.” – Johann Wolfgang Von Goethe



Technology transfer – an approach for national progress

The rapid expansion of science and technology began in the seventeenth century, some three hundred years ago. Since then science has built up an extensive body of knowledge about the world and about the history of man, the earth, the solar system and the universe. Science and technology are crucial to the development prospects of developing countries in two different ways. First, science and technology can provide tools that help alleviate the specific problems that afflict many poor countries and delay their development, such as disease and lack of infrastructure (information, transport, energy, etc.). Second, science and technology are central to the dynamics of economic development itself. Economically successful countries are those that are able to turn technical innovation into economic productivity. Successful technology transfer from developed to developing countries can improve lives and economic growth, as technology transfer is the process of sharing knowledge, skills, methods of manufacturing, manufacturing samples and facilities among governments, making them accessible to a wider range of users who can then further develop and exploit the technology. At the same time, technology transfer is not a process that characterizes only a stage in the development of the Third World; it occurs constantly in the First World as well. In theory, technology transfer is closely related to the diffusion of innovations.
The inclination of developed countries to facilitate access to and transfer of technology to developing countries is reflected in a number of international agreements. These agreements recognize that technology transfer to developing countries is important to facilitate their integration into the global economy and to help them meet their international obligations and commitments. They also recognize that technology transfer is important in facilitating the creation of a sound and viable technological base in developing countries. Since independence in 1947, science and technology in Pakistan has seen many ups and downs, because Pakistan inherited very few people capable of scientific development and technological research. But in the past few decades, Pakistan has made noticeable progress in technological development. The existing institutional framework for science and technology in Pakistan comprises governmental and non-governmental institutions. In 1964, the Scientific and Technological Research Division was established for the coordination and implementation of national science and technology policy, the promotion of research and the utilization of its results, and the coordination of scientific and technological manpower.
Technology transfer is an important issue and a key component of the economic and social development of developing nations. The successful implementation of technology depends not only on good technical specifications but also on the right social, political, and institutional environment. The existing type and capacity of educational institutions in developing countries are not sufficient. Educational institutions in developing countries should be provided with modern computing and information technology (ICT) facilities to enable hands-on practical training, and special attention should also be paid to on-the-job training. The training of technologists and operators, which forms the basis for the implementation of a new technology, has to be given a high priority.
It is obvious that the technology transfer from the developed countries to developing countries is not a straightforward mechanism. Keeping in view the importance of problems relevant to the technology transfer, many developing countries have established institutions for public support. The situation demands that the government of Pakistan must establish institutions, with proper technical manpower to deal with the problems relevant to foreign technology transfer. The institutions may facilitate the local industry and other technology users in the following ways:
·         Locating the proper information sources of foreign technologies.
·         Facilitating the industrial sector in pricing international technology.
·         Establishing complete coordination between technology vendors and users.
·         Conducting research studies for technology transfer in various sectors.
·         Making proper arrangements for assessment and evaluation of technology in terms of its effect on environment and society.

On the basis of the above arguments, it is clear that for successful technology transfer there is an urgent need to establish institutions, with appropriate policies, across Pakistan to deal with the various aspects of technology transfer and sustain scientific growth in the country.


Health ICT towards the millennium development goals

"The implementation of health ICTs in developing countries particularly Pakistan has been hampered by traditional obstacles like poor infrastructure, lack of resources, and insufficient political commitment and support. This can be appropriately summarized as follows, as the “Four Cs” – connectivity, cost, capacity, and culture. Presently, the problem of improving health-care delivery in developing countries is more about the impartial distribution of available resources to all areas of the health system than about technology. Technologies exist that would help doctors working in isolated rural villages to access to up-to-date medical information and communicate with colleagues, and even to diagnose illnesses and treat patients". 

Saturday, 10 November 2012

Radiology – the future of diagnosis and treatment | The News Tribe Blogs


The international imaging community is organizing the International Day of Radiology (IDoR) on Thursday, 8th November 2012, to raise awareness of the value of radiology in safe patient care and to improve understanding of the vital role of radiologists in healthcare practice. The purpose of this day is to increase knowledge of radiographic imaging and therapy, which play a crucial role in the diagnosis and treatment of patients. World Radiology Day marks the anniversary of the discovery of X-rays: on this day in 1895 Professor Wilhelm Roentgen discovered X-rays, and within three months of the discovery radiographs were being generated in major cities. In honor of his discovery, the European Society of Radiology (ESR), along with the Radiological Society of North America (RSNA) and the American College of Radiology (ACR), decided that 8th November should be marked yearly with a dedicated day of observance.

Medical imaging is one of the most thrilling and progressive disciplines in healthcare and a field of great activity in terms of technological and biological research. Early radiology was embedded in morphology, namely skeletal morphology. The change towards imaging the physiology of the human body began with radiology and nuclear medicine. In the midst of the excitement brought about by Roentgen's discovery, Becquerel discovered radioactivity in early 1896. Like the discovery of x-rays, this discovery was accidental; thus began the dawn of the medical imaging age. Subsequently, numerous scientists, for instance the Curies and Rutherford, contributed to the advancement of radiology. The use of single-photon emission computed tomography (SPECT) and, to a greater extent, positron emission tomography (PET) to display functional abnormalities not detected by other imaging tools has made assessment of treatment practicable.

In the early years, radiographs were made onto glass photographic plates coated with emulsion on only one side. In 1918, Eastman introduced film coated with emulsion on two surfaces. Radiography at this time was focused on imaging of the extremities, mainly to detect fractures and to localize the position of bullets. Further development brought an intravenous contrast agent, marketed for urinary tract radiography in 1927. The next development involved the use of a fluorescent screen, an x-ray tube, an x-ray table and red goggles, and required the radiologist to stare directly into the screen so that x-ray images could be displayed in real time. This was a rather primitive method, as the fluorescence emitted was very dim. The iodine-based contrast arteriogram in a patient was reported in 1929 by Dos Santos, roughly 34 years after the discovery of x-rays. In the late 1980s and early 1990s, however, two essential technologies greatly impacted the evolution of angiography: moveable multiple-angle C-arm fluoroscopy and digital image acquisition. By the 1970s ultrasound (US) and computed tomography (CT) had arrived, displacing angiography as the supreme imaging tool in radiology. By the 1990s duplex ultrasound, CT angiography and Magnetic Resonance (MR) angiography began to replace diagnostic arteriography for the direct study of vascular pathology. In most radiology departments today, catheter-based angiography is reserved mainly for the diagnosis of atherosclerotic vessels and as an adjunct to interventional procedures. The emergence of three powerhouse imaging tools, namely ultrasound, computed tomography and magnetic resonance imaging, has revolutionized the care of patients across the continuum of medicine and surgery.

Radiology is now often referred to as ‘imaging’, reflecting the fact that it is no longer dependent on x-rays alone. Over the years, ultrasound has stood the test of time, proving to be a safe, reliable, portable and cheap imaging modality. In 1972 cross-sectional imaging became a catchphrase; this was attributed to the invention of computed tomography. The earliest CT scanners were limited to imaging of the head; by 1976 the technology had evolved to whole-body scanners, and by the 1980s CT scans had gained worldwide acceptance. Today there are an estimated 600,000 locations around the world where this diagnostic tool is in use. The prototype CT scanners took roughly four minutes of elapsed time to acquire a single image; modern units produce images in less than 0.5 seconds. The advent of CT had an enormous effect on our ability to ‘SEE’ inside the body and immediately changed the practice of medicine; the momentum created by CT scanners fueled the commercial development of MRI systems. In its infancy, many thought that MRI would have a limited impact because of its high cost, the technical difficulties associated with it, and the belief that CT scanning was a superior method of imaging. Instead, MRI quickly became the primary imaging method for brain and spine imaging as well as functional imaging of the heart.

Computers and the digital world have impacted the science of Radiology, bringing it to what it is today. The advancement of artificial intelligence in the last 25 years has created an explosion of diagnostic imaging techniques. These techniques have now been developed for digital rather than photographic recording of conventional radiographs. In the early days, a head X-ray could require up to 11 minutes of exposure time; now digital radiographic images are made in milliseconds, with a radiation dose as little as 2% of what was used for the 11-minute head examination. The resolution achievable by the different imaging methods may be classified as spatial, contrast or temporal. Spatial resolution is the ability of a system to resolve anatomic detail. Contrast resolution is the ability of the system to differentiate between tissues, especially to distinguish normal from pathological tissue. Temporal resolution is the ability of the modality to reflect changing physiological events, such as cardiac motion, or disease remission or progression, as a function of time. Each imaging modality has its strengths and weaknesses; much to the frustration of hospital administrators, no single method will solve all diagnostic problems, and the fusion of knowledge gleaned from different modalities serves our patients best.

Medical Tourism: Patients beyond the Borders | The News Tribe Blogs


"Medical tourism has grown in a number of countries but, the main region is Asian countries such as Thailand, India, Malaysia and Singapore. Medical tourists not surprisingly are mainly from rich world countries where costs of medical care may be very high, and where the ability to pay for alternatives is also high. American, European and Canadian patients favor India, Thailand and Malaysia. India ranks second in the world in medical tourism, with Thailand leading the pack".


Friday, 24 August 2012

Importance of Biomedical Engineering in Pakistan

The 21st century is technically called the Biological Century. The world is changing globally step by step, and the modern era sees the application of engineering in almost every field of science, especially biological science. More technological advancement in the medical and industrial arenas is predictable, with heavily funded research programs ongoing in most countries of the world. Developments in the fields of biology and medicine, such as human genome sequencing and research to create cell and organ functions, have led to serious change in many industrial segments and strengthened the medical engineering profession. Although the conventional areas of engineering and other technology innovations will continue, more new opportunities will arise in Biomedical Engineering and in the fields of biology, medicine, health and the delivery of healthcare. The Biomedical age is still at an embryonic stage, rising steadily as we proceed to develop the field of Biomedical Engineering.

Pakistan, like many other developing countries, is facing tribulations in health care delivery. The health system in Pakistan is currently going through several reforms at the federal, provincial and district levels, particularly to improve the delivery of health services to the population. Although our nation's health care providers – surgeons, physicians, nurses, and others – work hard to provide life-saving and life-improving care to millions of Pakistanis, the level of quality and efficiency of care varies significantly across the country. Good health, as people know from their own experience, is a critical part of well-being. With growing healthcare awareness, an increasing population and greater affordability of optimized healthcare, the need for qualified Biomedical Engineering professionals has increased in Pakistan; a suitable and applicable structure is therefore required to bridge the gap between medical technology and patient care. Currently, the status of Biomedical Engineering in Pakistan is far from satisfactory. Federal and provincial governments should make policies to introduce Biomedical Engineering departments in hospitals and other healthcare centers. Biomedical Engineers can play a key role in the delivery of healthcare in both the private and government sectors.
In Pakistan, Biomedical Engineers should be employed in universities, industry, hospitals, research centers, medical education institutions, teaching and government regulatory agencies. Biomedical Engineers should be employed in government positions for product testing and safety, besides establishing safety standards for devices. In the hospital environment, Biomedical Engineers can provide recommendations and supervision in the selection of medical equipment, and they can also monitor the performance of the equipment on a continuous basis. A well-established hospital cannot offer quality healthcare without a Biomedical Engineering department, particularly a hospital involved in secondary and tertiary care, because such hospitals are full of medical equipment, instruments, devices and machinery that must be operated, calibrated and maintained by Biomedical Engineers in an appropriate and skilled manner.

Consequently, Biomedical Engineering has a huge impact on the world we live in today. There is now an array of medical devices and machines that can both improve health and save lives. Indeed, medical care will be strongly influenced by the revolutionary changes brought about by the Biomedical age in terms of quality, technology, cost, and lifestyle, especially in developing countries like Pakistan. Medical care in the Biomedical age will flexibly meet individual, diversified, and comprehensive needs. Biomedical Engineering must work alongside doctors and healthcare providers in Pakistan for better health.






Sunday, 19 August 2012

Women in ICT: Myth and reality in developing regions


Nowadays it is well known that Information and Communication Technologies (ICTs) can open new opportunities for development to everybody. Yet lack of access to them in developing countries creates difficulties for people's individual and social advancement as well. Women in developing regions bear the brunt of the digital gender divide because of huge responsibilities for their families and children at home, which creates challenges for them in education, employment, and participation in governance and business. Women have always had an important role in educating our young children and developing our societies; it is obvious that empowering them with new tools and values will surely help them contribute to the competitiveness of our economies and to building a new generation which can fully understand the new challenges of the technological world of the developing regions. Needless to say, today there are many barriers to women's access to ICT, especially in developing regions, because ICT is considered by the majority of people as a primarily male industry. Women are underrepresented among ICT users and very rarely work as developers. In some countries cultural norms and even concerns over personal safety may make it difficult for women to attend training courses.
Over the last twenty years many intervention programmes have been implemented to increase the number of women in the Information and Communication Technology (ICT) profession. In 1995, the United Nations Commission on Science and Technology for Development (UNCSTD) recognized the growing influence of ICT in development and the importance of women's participation in discussions regarding its integration globally. To that end, it established a Gender Working Group to address the significant gender issues, from access to control. The United Nations Division for the Advancement of Women (DAW), the International Telecommunication Union (ITU) and the UN ICT Task Force Secretariat released a report in 2002 that focused on ICTs as tools to advance and empower women. When the World Summit on the Information Society (WSIS) was established, a Gender Caucus was created to ensure women had a seat at the table and a voice in the room. Research has shown that in developing countries women enjoy fewer benefits from ICTs than men; it has also been found that gender-based obligations, societal biases, and even physical strength can restrict women's ability to learn about or use new technologies. For example:

·         Women are responsible for running households; they are less mobile and have less free time than men, and therefore cannot easily take advantage of training and other resources;
·         Male students discourage female students from accessing computers in labs by pushing them out of line;
·         ICT use can shift family dynamics and the balance of power, causing conflict in the home which can lead to arguments, violence, divorce, and even death;
·         Women often feel uncomfortable or annoyed when visiting internet cafes on their own.


Women in developed countries are using ICT to expand their missions and drive their passion to improve the world from the grass roots. There is a growing recognition that women's engagement in ICTs is important for multiple forms of development, including social and political justice as well as economic development. At present, however, the ICT sector does not take full advantage of female talent in developing countries. This is bad for the sector and bad for those women who could create new opportunities for themselves and their families with ICT jobs that deliver better salaries and career paths than most other sectors. Despite the obvious benefits, many women never consider a career in ICTs, particularly in developing countries, because there is a lack of awareness among students, teachers and parents of what a career in ICT could offer.


Telehealth solution for developing countries


"In 1994, a multi-disciplinary group of young professional from 24 different countries gathered and wrote a visionary report entitled, “Global Access Telehealth and Education System” (GATES). This report detailed how to utilize information and communication technology (ICT) to provide health and education services to the entire world, in particular developing countries. Developing countries face various problems in the provision of medical service and health-care. Many developing countries have inadequate health-care and medical services and they also suffer from shortage of doctors and health-care professionals. Telehealth is a relatively newer concept as far as most of the developing countries are concerned".


BioMedical engineering: Delivery challenges in developing countries


BioMedical engineering (BME) is a multidisciplinary field that spans traditional boundaries, connecting the engineering and physical sciences to the biological sciences and medicine in order to develop and apply new technologies in patient-oriented research and clinical healthcare.



Wednesday, 11 January 2012

Telemedicine: A Part of the Medical Team


The convergence of information and communication technologies (ICT) for improving the health system through telemedicine addresses both changes in access to healthcare information and services and the wider dissemination of healthcare-related skills and specialist expertise into the community, into the home and ultimately to the individual. The use of the internet and high-tech communications in health care has led to new approaches to medical treatment and to challenging legal questions. Health care providers, hospitals, pharmaceutical companies, insurers and their legal counsel exchange medical information through web-portal access using telemedicine. The application of telemedicine to health system improvement can be classified as the use of e-health in the provision of health services at a distance (tele-health), the management of clinical and administrative information (health informatics), and the sharing of information with health care providers, patients, and communities (e-learning). Proven benefits of telemedicine include improved access to care, enhanced quality of services, and reduced costs of care for patients and health care systems. However, the use of telemedicine within or between institutions involves a number of factors that require appropriate planning. Many of these issues cannot be addressed without the support of well-defined policies, rules, standards, or guidelines at the institutional, jurisdictional, and global levels. It is important for the planners of telemedicine at different levels to develop policies that can facilitate the adoption of telemedicine and prove its success through improvement in services and change in public health status.
Doctors have recently gained extensive experience of using telemedicine applications for consultations, education and training, and conferences. What is still lacking is systematic evaluation of these new approaches compared with traditional measures. Trials involving consultations for diagnostic, monitoring, and interpretative purposes should be blinded and multicentred, and should include tests of patient satisfaction as well as macro-economic considerations. The quality of educational programmes and conferences should be documented and compared with traditional teaching methods. International standards need to be developed for such evaluations, to allow comparison between trials performed at national and international levels. Pakistan is in a good position to contribute to these developments because of a well-integrated health care system and excellent telecommunication facilities. Through telemedicine, Pakistan could possibly assume a leading global position in the use of advanced information technology. There are still significant gaps in the evidence base between where telemedicine is used and where its use is supported by high-quality evidence. Further well-designed and targeted research that provides high-quality data will contribute strongly to understanding how best to deploy technological resources in health care. The critical requirements identified for the successful implementation of ICT projects and programs in the health sector of developing countries include:

  1. purpose, strategies, and scope of services to be provided;
  2. audiences, customers, and users (targeted populations);
  3. value of health and healthcare to the individual and community;
  4. current ways to assess individual and collective health problems (community health);
  5. needs of the individual, community, and nation;
  6. institutional user needs and commitments; and
  7. competencies of the organization implementing or hosting the ICT system. 
ICT has clearly made an impact on health care; it has:
  • improved the dissemination of public health information and facilitated public discourse and dialogue around major public health threats;
  • enabled remote consultation, diagnosis and treatment through telemedicine;
  • facilitated collaboration and cooperation among health workers, including the sharing of learning and training approaches;
  • supported more effective health research and the dissemination of and access to research findings;
  • strengthened the ability to monitor the incidence of public health threats and respond in a more timely and effective manner; and
  • improved the efficiency of administrative systems in health care facilities.

Telemedicine now has the potential to make a difference in the lives of sick people. Depending upon the level of technology employed, telemedicine can reduce the professional isolation of the rural primary practitioner in several ways. For instance, two-way interactive video consultation with specialists links the isolated practitioner with the specialist community of a large medical center. This virtual support system and contact with professional colleagues should enhance the integration of the rural or otherwise isolated practitioner. However, it must be noted that these contacts are only temporary, will occur only sporadically, and depend on the level of telemedicine technology employed. Therefore, the extent of the integrative possibility of telemedicine remains to be determined. The technology also has the potential to link the primary practitioner with on-line services that provide the opportunity to review the latest medical literature, thereby strengthening links to the professional medical community and improving the quality of care for the rural patient.

Wednesday, 7 December 2011

ADVANCING HEALTH-CARE SYSTEM PERFORMANCE WITH GEOINFORMATICS


Abstract:
Health-care systems represent a rich and demanding information environment that requires a comprehensive infrastructure capable of addressing inadequacies in existing systems. Although several modern geo-technologies have been available for over three decades, most health-care systems and public health agencies have incorporated only a limited number of these innovative technologies into their routine practices. An understanding of geo-informatics capabilities in the health-care industry, as a decision support system for responding to health-care challenges associated with assessment, assurance, and policy development, is needed. Geographic information systems (GIS) and analyses based on GIS have become widespread and well accepted. GIS is not the complete solution to understanding the distribution of disease and the problems of public health, but it is an important way to better illuminate how humans interact with their environment to create or deter health.

Keywords: Health-care system, Public health, Geo-technologies, Geographic information system (GIS)


Introduction:
Geography is important in understanding the dynamics of health and the causes and spread of disease. Any attempt to advance quality improvement in health-care requires geospatial consideration and the implementation of geo-informatics science and technology: geographic information systems (GIS), the global positioning system (GPS), and remote sensing applications. Recent progress in geo-technologies has intensified the need for evidence-based spatial decision support systems (SDSS) in health-care practice. A GIS integrates data from multiple sources, providing the ability to analyze and visualize how data relate over space and time. The use of GIS requires a geospatial database to be created, appropriate hardware and software to be acquired, applications to be developed, and all components to be installed, integrated and tested before users can work with the system. This paper provides a snapshot of the benefits of GIS and related technologies and how they might be used in health-care systems.

Geo-informatics in health care:
Health geo-informatics combines spatial analysis and modeling, the development of geo-databases, information systems design, human-computer interaction and networking technologies to understand the relationships between people, environments, and health effects. GIS provides the opportunity of linking databases to maps, creating visual representations of statistical data, and analyzing how location influences features and health events on the earth’s surface.
Within the last decade, the world has experienced some catastrophic events that clearly provide evidence of the importance of state-of-the-art health information systems (HIS). Compared with other public services such as natural resource management, urban planning and transportation, it is evident that the full capacity of GIS in health-care management has not been fully explored. There is limited evidence that GIS is being formally considered or regularly used in strategic decision-making in any major health-care planning system. Several initiatives that advocate the inclusion of GIS operations at different stages of health-care planning and management have been noted. In 2003, GIS was recognized as an emerging information technology that can be used to enhance the ability to prepare for and respond to public health emergencies. Several organizations, including the WHO, are committed to supporting countries in the adaptation and integration of GIS within their respective health-care programmes. Successful adoption of GIS by health-care managers and policy-makers depends on understanding the spatial behaviors of health-care providers and consumers in the rapidly changing health-care landscape, and how geographic information affects these dynamic relationships.

Geo-informatics in Emergency Response:
In most cases, linking emergency resources with victims creates a geo-logistical challenge. To address this challenge, an integrated Advanced Emergency Geographic Information System (AEGIS) can be developed and accessed anywhere. AEGIS allows all emergency resources to be fully coordinated through a web-based situational awareness system for use in all emergency medical services. AEGIS monitors and maps the location and status of emergencies, locates victims and emergency response personnel, and tracks other factors, such as prevailing weather conditions, that can impact emergency response on a real-time basis. AEGIS overlays traffic congestion and accidents on freeways to plot the fastest routes to area trauma centers. All authorized emergency responders can access AEGIS via the Web or by using a basic cell phone or in-vehicle unit.
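
As an illustration of the routing step described above, the sketch below applies Dijkstra's shortest-path algorithm to a tiny road graph whose edge weights are travel times in minutes. The graph, node names and weights are invented for illustration; an AEGIS-style system would derive such weights from live traffic feeds rather than hard-coded values.

```python
# Minimal routing sketch for an AEGIS-style dispatcher (hypothetical data).
# Requires: pip install networkx
import networkx as nx

# Road network: edges carry estimated travel time in minutes, which a real
# system would update continuously from live traffic and incident feeds.
roads = nx.Graph()
roads.add_weighted_edges_from([
    ("incident", "junction_a", 4.0),
    ("incident", "junction_b", 2.5),
    ("junction_a", "trauma_center_1", 6.0),
    ("junction_b", "trauma_center_1", 9.0),
    ("junction_b", "trauma_center_2", 5.0),
], weight="minutes")

# Dijkstra finds the fastest route from the incident to each trauma center.
for center in ("trauma_center_1", "trauma_center_2"):
    path = nx.dijkstra_path(roads, "incident", center, weight="minutes")
    eta = nx.dijkstra_path_length(roads, "incident", center, weight="minutes")
    print(f"{center}: {eta:.1f} min via {' -> '.join(path)}")
```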

GIS for high-quality patient care management:
Ensuring the delivery of high-quality care requires caregivers to have accurate and timely information, and the ability to visualize it, at their fingertips. Hospitals that have developed patient/bed management systems that operate during non-surge periods are in a better position to provide critical information to local incident management during unanticipated disaster surges. Such a system captures a vast array of information about patients' admittance, room switches, discharge, and movement from in-hospital to outpatient care. On a broader scale, linking hospitals in local, statewide and multi-state systems will extend health-care capacity and the ability to adequately prepare for and respond to mass-casualty events and other regional public health emergencies.
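
At its core, a patient/bed management system of the kind described can be modeled as an event log over bed assignments. The toy sketch below (class, method and field names are invented, not taken from any real hospital system) shows how admittance, room switches and discharges might be captured while supporting real-time occupancy queries.

```python
# Toy bed-management sketch: an event log over bed assignments
# (all names are hypothetical, for illustration only).
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class BedRegistry:
    occupancy: dict = field(default_factory=dict)   # bed_id -> patient_id
    events: list = field(default_factory=list)      # audit trail of changes

    def _log(self, kind, patient_id, bed_id):
        self.events.append((datetime.now(), kind, patient_id, bed_id))

    def admit(self, patient_id, bed_id):
        assert bed_id not in self.occupancy, "bed already occupied"
        self.occupancy[bed_id] = patient_id
        self._log("admit", patient_id, bed_id)

    def switch(self, patient_id, old_bed, new_bed):
        # A room switch is recorded as a discharge followed by an admission.
        self.discharge(patient_id, old_bed)
        self.admit(patient_id, new_bed)

    def discharge(self, patient_id, bed_id):
        assert self.occupancy.get(bed_id) == patient_id
        del self.occupancy[bed_id]
        self._log("discharge", patient_id, bed_id)

    def free_beds(self, all_beds):
        # Real-time occupancy query used during surge planning.
        return [b for b in all_beds if b not in self.occupancy]
```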

Defining suitable locations for health-care services:
Access to health care is a significant factor contributing to a healthy population. Accessibility and utilization of health care depend largely on having the appropriate health-care resources in the right place at the right time. GIS has been used in a number of situations to estimate the optimal location for a new clinic or hospital, minimizing the distances potential patients need to travel while taking into account existing facilities, transport provision, hourly variations in traffic volumes, and population density. GIS applications demonstrate sophisticated use of health information to enhance facility utilization, improve the distribution of preventive and curative care, and provide an evidence-based rationale for targeted assistance and service delivery.
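
As a toy version of this kind of siting analysis, the sketch below selects, from a set of candidate clinic sites, the one that minimizes population-weighted straight-line distance to the surrounding villages. Coordinates and populations are invented, and a real study would use road-network distances and actual census layers rather than simple Euclidean geometry.

```python
# Toy facility-location sketch: choose the candidate clinic site that
# minimizes population-weighted straight-line distance (invented data).
import math

villages = [  # (x_km, y_km, population) -- hypothetical figures
    (2.0, 3.0, 1200),
    (5.0, 1.0, 800),
    (6.0, 6.0, 2500),
    (1.0, 7.0, 600),
]
candidate_sites = {"site_a": (3.0, 4.0), "site_b": (5.0, 5.0)}

def weighted_distance(site):
    # Sum of population * distance over all villages (person-km).
    sx, sy = site
    return sum(p * math.hypot(sx - x, sy - y) for x, y, p in villages)

best = min(candidate_sites, key=lambda name: weighted_distance(candidate_sites[name]))
print(f"Best site: {best} "
      f"(weighted distance {weighted_distance(candidate_sites[best]):.0f} person-km)")
```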

Resources required to implement a GIS:
Developing a GIS requires investment in computer hardware, GIS software, a networking environment, data, procedures, and trained staff. Staffing for a GIS programme is critical, as it is not easy to simply expand local health-care staff positions to fill the GIS need. Areas where expertise is needed include GIS project management, GIS database skills, and application development. Training of the health-care workforce in general computing, database principles, and GIS is essential for increasing efficiency of use.


Conclusion:
Several dimensions of health and human services can benefit from the adoption of geo-informatics as a way of improving health and of being in a better position to prevent and respond to public health emergencies. There is a need for health care systems to create new types of information that are clinically relevant as well as place- and time-sensitive in response to large-scale emergencies. When appropriately implemented, GIS could potentially act as a powerful evidence-based practice tool for early problem detection and solving, for identifying clinically effective and cost-effective actions and predicting outcomes, and for continually monitoring and analyzing changes in health-care practices.


Sunday, 30 October 2011

Medical Image Acquisition: Static to Digital World

Overview:
Early radiology was embedded in morphology, namely skeletal morphology. The change towards imaging the physiology of the human body began with nuclear medicine. With this transformation came the ability not only to display the presence of disease but also the mechanism of disease and the biology of treatment. In the midst of the excitement brought about by Roentgen's discovery, Becquerel discovered radioactivity in early 1896. Thus began the dawn of the nuclear age. Similar to the discovery of x-rays, the discovery of radioactivity was accidental. Becquerel had placed some photographic plates in a drawer with some crystals of uranium. Upon retrieving the plates, he found that they had been exposed. He deduced that the exposure must have come from rays of a radioactive source, i.e. the uranium crystals. Over the years numerous scientists such as the Curies and Rutherford contributed to the advancement of nuclear medicine. The use of single-photon emission computed tomography (SPECT) and, to a greater extent, positron emission tomography (PET) to display functional abnormalities not detected by other imaging tools has made assessment of treatment feasible.


Background:
In the early years, radiographs were made onto glass photographic plates coated with emulsion on only one side. In 1918, Eastman introduced film coated with emulsion on two surfaces. Radiography at this time was focused on imaging of the extremities, mainly to detect fractures and to localize the position of bullets. This was due to the fact that bone, soft tissue and dense foreign bodies provided the only contrast between materials. In 1910, an orally administered contrast medium (bismuth nitrate, later replaced by barium sulphate) was used to image the gastrointestinal system. Further development brought an intravenous contrast agent, marketed for urinary tract radiography in 1927.
The next development involved the use of a fluorescent screen, an x-ray tube, an x-ray table and red goggles, and required the radiologist to stare directly into the screen so that x-ray images could be displayed in real time. This was a rather primitive method, as the fluorescence emitted was very dim. The first iodine-based contrast arteriogram in a patient was reported in 1929 by Dos Santos, approximately 34 years after the discovery of x-rays. However, without the benefit of image intensifiers at this time, arterial access was obtained via a blind translumbar puncture. The emergence of image intensifiers gave a much-needed boost to this flagging enterprise. Greater strides were taken when Seldinger introduced a safer, simpler and more effective method of accessing the femoral artery. Despite the advent of the Seldinger technique, real advances in diagnostic angiography were still stunted, as fluoroscopy remained primitive. In the late 1980s and early 1990s, however, two essential technologies greatly impacted the evolution of angiography: moveable multiple-angle C-arm fluoroscopy and digital image acquisition.

The Power of Three:
By the 1970s ultrasound (US) and computed tomography (CT) had arrived, displacing angiography as the supreme imaging tool in radiology. By the 1990s duplex ultrasound, CT angiography and Magnetic Resonance (MR) angiography began to replace diagnostic arteriography for the direct study of vascular pathology. In most radiology departments today, catheter-based angiography is reserved mainly for the diagnosis of atherosclerotic vessels and as an adjunct to interventional procedures. Early imaging studies were projections of 3-dimensional (3D) body parts, displayed as if a steamroller, as in our favorite cartoons, had flattened the human body. This resulted in much overlap of body parts, making the interpretation of disease difficult. The emergence of three powerhouse imaging tools, namely ultrasound, computed tomography and magnetic resonance imaging, has revolutionized the care of patients across the continuum of medicine and surgery. Radiology is now often referred to as ‘imaging’, reflecting the fact that it is no longer dependent on x-rays alone. Over the years, ultrasound has stood the test of time, proving to be a safe, reliable, portable and cheap imaging modality. In 1972 cross-sectional imaging became a catchphrase; this was attributed to the invention of computed tomography (also known as computed axial tomography or CT scan). The earliest CT scanners were limited to imaging of the head; by 1976 the technology had evolved to whole-body scanners, and by the 1980s CT scans had gained worldwide acceptance. Today there are an estimated 600,000 locations around the world where this diagnostic tool is in use. The prototype CT scanners took roughly four minutes of elapsed time to acquire a single image; modern units produce images in less than 0.5 seconds. The advent of CT had an enormous effect on our ability to ‘SEE’ inside the body and immediately changed the practice of medicine; the momentum created by CT scanners fueled the commercial development of MRI systems. In its infancy, many thought that MRI would have a limited impact because of its high cost, the technical difficulties associated with it, and the belief that CT scanning was a superior method of imaging. Instead, MRI quickly became the primary imaging method for brain and spine imaging as well as functional imaging of the heart.

Entering the Digital World:
Computers and the digital world have impacted the science of Radiology, bringing it to what it is today. The advancement of artificial intelligence in the last 25 years has created an explosion of diagnostic imaging techniques. These techniques have now been developed for digital rather than photographic recording of conventional radiographs. In the early days, a head x-ray could require up to 11 minutes of exposure time; now digital radiographic images are made in milliseconds, with a radiation dose as little as 2% of what was used for the 11-minute head examination.
The resolution achievable by the different imaging methods may be classified as spatial, contrast or temporal. Spatial resolution is the ability of a system to resolve anatomic detail. Contrast resolution is the ability of the system to differentiate between tissues, especially to distinguish normal from pathological tissue. Temporal resolution is the ability of the modality to reflect changing physiological events, such as cardiac motion, or disease remission or progression, as a function of time. Each imaging modality has its strengths and weaknesses; much to the frustration of hospital administrators, no single method will solve all diagnostic problems, and the fusion of knowledge gleaned from different modalities serves our patients best.