Saturday, October 5, 2019
Cineplex Entertainment - The Loyalty Program Case Study
Cineplex Entertainment - The Loyalty Program - Case Study Example. The essay outlines the recommendations of Sarah Lewthwaite, marketing director for Cineplex Entertainment, to the committee of senior executives. The presentation contains persuasive arguments for a loyalty-program development campaign, set against a movie industry whose revenues vary widely from year to year. Cineplex Entertainment was founded in 1979 as a small chain of movie theatres. In 2005 Cineplex acquired its largest competitor and became Canada's largest film exhibitor; after that acquisition, customer traffic jumped to roughly 40 million visits. Cineplex also began offering value-added services to its customers, such as food at branded concession counters and arcade games. In the same year the company expanded into new markets, which generated additional customer traffic and boosted revenue. Although revenue rose sharply in 2005 compared with previous years, operating costs rose as well, which shrank the company's net income. Cineplex had issued Elite cards that rewarded customers with free movie viewings once they accumulated a certain number of points, but it had no CRM capability to help drive customer traffic. According to a 2005 survey, 95 per cent of respondents wanted the movie reward offer back. With that in mind, Sarah Lewthwaite presented the option of launching a loyalty program to the committee. Cineplex needed a loyalty partner because building its own data system would have cost about $5.5 million in the first year, so it went looking for one. Flight Miles, whose program counted 72 per cent of Canadians as active members, ran the top loyalty program in Canada. Flight Miles could give Cineplex access to its data bank of seven million customers, which would certainly help in targeting its market. The Flight Miles program would cost about $5 million per year plus $0.09 for every point issued to customers, and Flight Miles executives offered Cineplex $250,000 to make the deal more attractive. Scotiabank also approached Cineplex as a potential partner for the loyalty program. It is among the top five banks in Canada, with 6.8 million customers and 950 branches. Scotiabank proposed a 50-50 cost-sharing arrangement, expected naming rights on three theatres, and offered a three-card reward strategy as well. Cineplex's estimated share of the cost was about $3 million in the first year and $1.7 million and $1.9 million in the later years. Lewthwaite now had three options for the loyalty program and had to weigh the benefits as well as the constraints of each to arrive at the best choice. She also restructured the reward program. She performed a sensitivity analysis on concession revenue per guest, which might increase by 5 to 15 per cent, and considered a nominal one-time or annual membership fee of $2 to $5. Lewthwaite also knew that only 40 per cent of the points earned by customers in the loyalty program would be redeemed annually. She then drafted a reward structure containing a preliminary list of four options, but she was not sure which option would resonate with customers. The loyalty program would also require a database vendor who could manage the customer data.
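To make the kind of sensitivity analysis described above more concrete, here is a minimal sketch of how the trade-off could be modelled. It uses only the figures quoted in the case summary (a roughly $5 million annual program cost, $0.09 per point issued, 40 per cent of points redeemed, a $2 to $5 membership fee, and a 5 to 15 per cent lift in concession revenue per guest); the membership base, visit frequency, concession spend per guest, and points issued per visit are invented placeholders, not case data.

# Illustrative sensitivity sketch for the loyalty-program decision.
# The $5M/yr program cost, $0.09 per point, 40% redemption rate, $2-$5 fee,
# and 5-15% concession lift come from the case summary above; every other
# number is a made-up placeholder.

def incremental_profit(members, visits_per_member, concession_per_guest,
                       concession_lift, membership_fee,
                       points_per_visit=10, cost_per_point=0.09,
                       redemption_rate=0.40, fixed_cost=5_000_000):
    """Rough annual profit impact of the loyalty program (illustrative only)."""
    visits = members * visits_per_member
    extra_concession = visits * concession_per_guest * concession_lift
    fee_revenue = members * membership_fee
    point_cost = visits * points_per_visit * cost_per_point * redemption_rate
    return extra_concession + fee_revenue - point_cost - fixed_cost

for lift in (0.05, 0.10, 0.15):           # 5-15% concession lift (case range)
    for fee in (2, 5):                    # $2-$5 membership fee (case range)
        profit = incremental_profit(
            members=1_000_000,            # placeholder membership base
            visits_per_member=6,          # placeholder visit frequency
            concession_per_guest=4.00,    # placeholder concession spend ($)
            concession_lift=lift,
            membership_fee=fee,
        )
        print(f"lift={lift:.0%}, fee=${fee}: incremental profit {profit:,.0f}")

Sweeping the lift and fee ranges this way shows how quickly the program's payoff swings between the pessimistic and optimistic scenarios, which is exactly the question Lewthwaite's sensitivity analysis had to answer.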
Friday, October 4, 2019
English Essay Example | Topics and Well Written Essays - 500 words - 20
English - Essay Example. It grew out of two sports, soccer (association football) and rugby football, each of which remains a separate sport with its own specific set of rules. The international body governing soccer is FIFA (Fédération Internationale de Football Association), which organizes the World Cup, an event that is hugely popular around the world. FIFA governs all levels of soccer, including international professional games, Olympic competitions, and youth leagues. The FIFA World Cup is the sport's premier event; held every four years, it pits national teams from 32 countries against one another. The most popular and major professional league for American football is the National Football League (NFL), although over the years several other leagues have been formed in North America and Europe. Soccer has leagues of its own, such as MLS (Major League Soccer), with many teams that have fans worldwide; some of the most famous club teams are A.C. Milan of Italy, Ajax Amsterdam of the Netherlands, Manchester United of England, Real Madrid of Spain, Boca Juniors of Argentina, São Paulo of Brazil, and Colo-Colo of Chile. What the two games have in common is that each is played between two teams whose players try to score by putting the ball into the opponents' goal; in soccer this must be done using any part of the body except the hands, so players rely on their skill with feet and head as they kick, dribble, or pass the ball toward the goal or to another player. The table compares and contrasts both the
Thursday, October 3, 2019
Response to Rereading America Essay Example for Free
Response to Rereading America Essay. The introduction to the text Rereading America challenges us to look beyond the common misconceptions of the "myth" of the American Dream and to think critically. As far as what I think it means to be an American citizen, there are many things to consider. I believe that being an American can imply freedom, security, and prosperity to some; to others, however, it can mean a struggle to survive. The ideal American Dream, no matter your interpretation of the idea, seems simply out of reach for most. My idea of what it means to be an American is that you are willing to do the work. You are not just waiting for things to simply be handed to you on a silver platter. To me, it also means that you can become successful and accomplish the goals you set your mind to. Even though there is no path to success that avoids obstacles, I understand the value of hard work. Being an American, for me, is saying that I can complete any task in my future as long as I stay focused and take it step by step. Without hard work in college I will not be as successful in the workplace as I would like to be. I understand that nothing will simply be given to me, but that I have to work hard and earn the things I desire. As far as college goes, I will have to maintain a certain grade point average to be accepted into the nursing program. This will require a large amount of effort, and I will need the determination to continue on my journey of education, which will help me prepare for the real world. Being an American, for me, means I am willing to strive to meet the requirements and standards for a college student and a person in the workplace. With hard work I will eventually achieve success.
The Transmission Electron Microscopy Biology Essay
The Transmission Electron Microscopy Biology Essay. The transmission electron microscope operates on the same basic principles as the light microscope but uses electrons instead of light. What you can see with a light microscope is limited by the wavelength of light. TEMs use electrons as the illumination source, and their much shorter wavelength makes it possible to obtain a resolution a thousand times better than with a light microscope. TEM uses a technique whereby a beam of electrons is transmitted through an ultra-thin specimen, interacting with the specimen as it passes through. An image is formed from the interaction of the electrons transmitted through the specimen; the image is magnified and focused onto an imaging device, such as a fluorescent screen or a layer of photographic film, or is detected by a sensor such as a CCD camera. TEMs are capable of imaging at a significantly higher resolution than light microscopes, owing to the small de Broglie wavelength of electrons. This enables the instrument's user to examine fine detail, even as small as a single column of atoms, which is tens of thousands of times smaller than the smallest resolvable object in a light microscope. TEM is a major analysis method in a range of scientific fields, in both the physical and biological sciences. TEMs find application in cancer research, virology, and materials science, as well as in pollution, nanotechnology, and semiconductor research.

History of TEMs. The first operational electron microscope was presented by Ernst Ruska and Max Knoll in 1932, and six years later Ruska had a first version on the market. In 1986 Ruska received a Nobel Prize in Physics for his fundamental work in electron optics and for the design of the first electron microscope. The following table gives a basic outline of the history of the electron microscope by decade.
Year / Specimens and applications / Instrumentation and theory / Resolution
1940s: Replicas (oxide, carbon, plastics), surfaces, slip steps, extracted particles, fractography. Instrumentation/theory: 50 kV, single condenser; little or no theory (a first basic theory of electron microscopy was published in 1949 by Heidenreich). Resolution: ~10 nm.
1950s: Thin foils (from bulk, deposited), defects, phase transitions. Instrumentation/theory: 100 kV; contrast theory developed. Resolution: ~0.5-2 nm.
1960s: Metals, semiconductors, ceramics, minerals; dynamic in-situ studies, substructure of solids, radiation damage, microdiffraction. Instrumentation/theory: high-voltage electron microscopes (Toulouse: 1.2 and 3 MeV), scanning electron microscopes, accessories for in-situ studies, controlled experiments. Resolution: 0.3 nm (transmission), ~15-20 nm (scanning).
1970s: Catalysts, quasicrystals; high-resolution imaging, lattice imaging. Instrumentation/theory: analytical transmission electron microscopy, scanning transmission electron microscopy, energy-dispersive x-ray spectra, electron energy loss spectroscopy, commercial high-voltage electron microscopy (0.4-1.5 MeV), high-resolution imaging theory. Resolution: 0.2 nm (transmission), 7 nm (standard scanning).
1980s: Virtually all materials; atomic resolution in close-packed solids, surface imaging, small particles. Instrumentation/theory: commercial medium-voltage high-resolution/analytical electron microscopy (300-400 kV), improved analytical capabilities, energy-filtering imaging, ultra-high-vacuum microscopes. Resolution: 0.15 nm (transmission), 5 nm (scanning at 1 kV).
1990s: Fast computation for image simulation, alloy design, nanostructures; integrated digital scanning and image processing. Instrumentation/theory: surface atomic microscopy, orientation imaging microscopy. Resolution: 0.1 nm (transmission), 3 nm (scanning at 1 kV).
2000s: (no entries given).

Electron microscopy in the 1960s. In 1969 RCA dropped out of the electron microscope business, having decided that it could make more money selling record albums and consumer electronic devices. General Electric had never become a major power in the electron microscope business. This left the field wide open for companies such as JEOL, Hitachi, and Akashi in Japan, and Philips, Siemens, and Zeiss in Europe. The resolution of the best TEMs was now approximately 0.3 nm (3 Å); JEOL claimed a resolution of 0.2 nm (2 Å) for its 1968 model JEM-100B. Accelerating voltages were still typically in the 100 kV range, although JEOL marketed a 200 kV instrument in 1967 called the JEM-200. Philips marketed a very popular 100 kV microscope called the EM 300 in 1966, claiming that it was the first fully transistorized electron microscope and that it could attain a point resolution of 0.5 nm (5 Å). More than 1,850 units of the EM 300 were sold. Another approach to the study of materials that emerged in the 1960s involved increasing the accelerating voltage of the electron gun to extreme levels, up to 3 MeV, in an effort to penetrate more deeply into thicker samples. CEMES-LOE/CNRS at Toulouse, France, developed a 3 MeV instrument around 1965, followed closely by JEOL, which released a 1 MeV microscope, the JEM-1000, in 1966. (One MeV represents a million electron volts, while one kV is a thousand electron volts, so 1,000 kV = 1 MeV.) These ultrahigh-voltage EMs were so large that they typically occupied their own two-story building. The electron gun and its associated high-voltage electronics were located near the ceiling of the second story, while the operator sat at the bottom of the microscope column looking at the fluorescent screen. Hitachi's 1964 model HU-500 stood 4 meters tall; later, higher-MeV versions eventually made this look small. (A photograph of the 1 MeV Atomic Resolution Microscope (ARM) at the Lawrence Berkeley Laboratory accompanied the original text.)

Electron microscopy in the 1970s. The 1970s were a time of rapid development on all fronts in the electron microscope industry. Further improvements in TEM came from brighter electron sources (lanthanum hexaboride and field emission guns).
The resolution of the TEM was pushed to 0.2 nm (2 Å) in the 1970s, with better results reported in some cases for lattice-imaging resolutions; Hitachi claimed a 1.4 Å lattice resolution for its 1975 model H-500 TEM, and JEOL claimed the same resolution for its 1973 model JEM-100C. Accelerating voltages of 100 kV maximum had become the norm. In contrast to the low-cost instruments, Philips' 1972 model EM 301 TEM was designed for high performance and versatility for the skilled operator who had the time to coax the best results from the instrument. The EM 400, introduced in 1975, used a LaB6 electron gun, which was ten times as bright as the standard tungsten filament of the time. On the down side, the reactivity of lanthanum hexaboride required an ultra-clean vacuum of 10^-6 Torr. In 1977 Philips introduced accessories for the EM 400, including a secondary-electron detector for topographical studies and a field emission gun (FEG), a single-crystal tungsten-tipped filament that emits electrons from a very localized region of the tip to produce narrow, bright electron beams. FEGs can have 100 to 1,000 times the brightness of a LaB6 filament, with electron beam diameters as small as 1 nm. Vacuum requirements for these FEGs are 10^-10 Torr. JEOL started with the JEM-100B Analytical model in 1970, which added scanning ability and an EDX x-ray spectrometer to the TEM. This was improved upon by the JEM-100C in 1973, with its 1.4 Å resolution, and further upgraded by the JEM-100CX Analytical model in 1976, which added an ultraclean vacuum system and a LaB6 electron gun. In the ultrahigh-voltage EM market, the Hitachi 3 MeV HU-3000 was installed at Osaka University in 1970. This accelerating voltage was the highest ever for an electron microscope, and a resolution of 4.6 Å was reported for the instrument. The 1976 model H-1250 had a maximum voltage of 1,250 kV but a superior resolution of 2.04 Å.

Electron microscopy in the 1980s. During the 1980s TEM resolutions were further reduced to 1.0 to 1.5 Å, making imaging of atoms in lattice planes possible. Microprocessor control of microscopes and computerized analysis of data became common due to the emergence of the personal computer in the early 1980s. This microprocessor control brought features such as an auto-stigmator and auto-focus, freeing the microscope operator from the mundane tasks that had always been involved in using the instrument. Electron energy loss spectroscopy (EELS) detectors were incorporated in STEMs and AEMs, allowing detection of low-atomic-number elements that could not be seen using x-ray techniques. The demands of the fast-growing integrated-circuit industry produced electron microscopes designed for non-destructive testing of semiconductor wafers and for functional testing of ICs. Smaller electron beam sizes made it possible to switch from microprobe to nanoprobe technology; elemental mapping of a sample's surface could now be done at the nanometer level. Development of low-cost instruments was not a priority in the 1980s. Some that had been developed in the 1970s continued to be sold, but development was focused on high-performance, high-resolution, microprocessor-controlled instruments. JEOL produced seven new TEM units between 1980 and 1986. These included the JEM-1200 EX (1981), which added microprocessor control to the JEM-100 CX (1976). The same model equipped with an EDS x-ray spectrometer was called the JEM-1200 EX/Analytical microscope.
The 1984 model JEM-2000 FX/Analytical had a maximum voltage of 200 kV and a resolution of 2.8 Å; this instrument marked the switch from a microprobe beam to a nanoprobe. The JEM-4000 FX/Analytical microscope, introduced in 1986, raised the accelerating voltage to 400 kV, which produced a beam probe size only 2 nm in diameter. After years of a standard 100 kV accelerating voltage with a few ultrahigh-voltage units thrown in, these medium-voltage microscopes finally became popular.

Electron microscopy in the 1990s. The 1990s produced several corporate mergers in the electron microscope industry. Carl Zeiss and Leica joined to form LEO Electron Microscopy, Inc. In 1996 Philips bought Electroscan, the developer of the environmental SEM in the 1980s, to form Philips Electroscan. The following year Philips Electron Optics and a company called FEI merged under the name FEI to continue manufacturing electron microscopes. Hitachi and JEOL remained independent entities. The resolution of TEMs had already reached its theoretical limit (the best possible resolution predicted by calculations), so the 1 Å resolution obtained using field emission gun (FEG) electron sources remained the standard. Medium-voltage instruments of up to 300 kV were common, although 100 kV instruments kept their long-lasting popularity. Computers were now a vital part of every electron microscope, with graphical user interfaces (GUIs) being the norm. They were involved in both the control of the instrument and the processing of data, including post-analysis enhancement of micrographs using contrast-enhancing software. JEOL offered TEMs with maximum accelerating voltages of 120, 200, and 300 kV. The 120 kV model JEM-1230 had a resolution of 0.2 nm (2 Å). The JEM-2010 F FasTEM (200 kV) and the JEM-3000 F FasTEM (300 kV) both used FEG sources and achieved resolutions of 0.1 nm (1.0 Å).

Three meetings of the Electron Microscopy Society of America (1968, 1975, and 1980). The Electron Microscopy Society of America (now known as the Microscopy Society of America) was founded in 1942, when it began holding annual meetings for instrument makers and users to gather and discuss the technology and its applications. The topics of papers given at these meetings present a snapshot of the state of electron microscopy at the time. A brief look at three of these meetings shows the evolution of the technology and its applications over a 12-year period. In the brief twelve-year span from 1968 to 1980, the physical sciences overtook the biological sciences at EMSA meetings, judging solely by the number of papers presented. A large part of this shift is probably due to the emergence of the scanning electron microscope in 1965, which made examination of the surface of bulk specimens possible for the first time. Since physical scientists could now look at real samples instead of replicas or thin films, activity in microscopy of materials increased dramatically. With no similarly dramatic development in biological microscopy, the balance shifted.

The Science of TEMs. Comparison of Light (LM) and Electron Microscopes.
a. Similarities
1) Illumination system: produces the required radiation and directs it onto the specimen. Consists of a source, which emits the radiation, and a condenser lens, which focuses the illuminating beam (allowing variations of intensity to be made) on the specimen.
2) Specimen stage: situated between the illumination and imaging systems.
3) Imaging system: lenses which together produce the final magnified image of the specimen.
Consists of i) an objective lens, which focuses the beam after it passes through the specimen and forms an intermediate image of the specimen, and ii) the projector lens(es), which magnify a portion of the intermediate image to form the final image.
4) Image recording system: converts the radiation into a permanent image (typically on a photographic emulsion) that can be viewed.
b. Differences
1) Optical lenses are generally made of glass with fixed focal lengths, whereas magnetic lenses are constructed from ferromagnetic materials and windings of copper wire, producing a focal length that can be changed by varying the current through the coil.
2) Magnification in the LM is generally changed by switching between objective lenses of different power mounted on a rotating turret above the specimen. It can also be changed if oculars (eyepieces) of different power are used. In the TEM the magnification (focal length) of the objective remains fixed while the focal length of the projector lens is changed to vary magnification.
3) The LM has a small depth of field, so different focal levels can be seen in the specimen. The relatively large depth of field in the TEM means that the entire (thin) specimen is in focus simultaneously.
4) Mechanisms of image formation vary (phase and amplitude contrast).
5) TEMs are generally constructed with the radiation source at the top of the instrument; in LMs the source is generally situated at the bottom.
6) The TEM is operated at high vacuum (since the mean free path of electrons in air is very small), so most biological specimens must be dehydrated.
7) Biological TEM specimens are rapidly damaged by the electron beam.
8) TEMs can achieve higher magnification and better resolution than LMs.
9) Price tag!!! (100x more than an LM)
(Figures in the original showed a cross-sectional view of a standard TEM, the transmission electron microscope at The Chinese University of Hong Kong, and a schematic outline of a TEM.) A TEM contains four parts: electron source, electromagnetic lens system, sample holder, and imaging system.

A. Electron Source. The electron gun produces a beam of electrons whose kinetic energy is high enough to enable them to pass through thin areas of the TEM specimen. The gun consists of an electron source, also known as the cathode because it is at a high negative potential, and an electron-accelerating chamber. There are several types of electron source, operating on different physical principles, which we now discuss.

i. Thermionic Emission. Figure 3-1 shows a common form of electron gun. The electron source is a V-shaped (hairpin) filament made of tungsten (W) wire, spot-welded to straight-wire leads that are mounted in a ceramic or glass socket, allowing the filament assembly to be exchanged easily when the filament eventually burns out. A direct (dc) current heats the filament to about 2700 K, at which temperature tungsten emits electrons into the surrounding vacuum by the process known as thermionic emission. Figure 3-1: Thermionic electron gun containing a tungsten filament F, Wehnelt electrode W, ceramic high-voltage insulator C, and o-ring seal O to the lower part of the TEM column. An autobias resistor RB (actually located inside the high-voltage generator, as in Fig. 3-6) is used to generate a potential difference between W and F, thereby controlling the electron-emission current Ie. Arrows denote the direction of electron flow that gives rise to the emission current.
Raising the temperature of the cathode causes the nuclei of its atoms to vibrate with increased amplitude. Because the conduction electrons are in thermodynamic equilibrium with the atoms, they share this thermal energy, and a small proportion of them achieve energies above the vacuum level, enabling them to escape across the metal/vacuum interface. The rate of electron emission can be represented as a current density Je (in A/m^2) at the cathode surface, which is given by the Richardson law: Je = A T^2 exp(-Φ/kT), where T is the absolute temperature (in K) of the cathode, Φ is the work function of the cathode surface, and A is the Richardson constant (~10^6 A m^-2 K^-2), which depends to some degree on the cathode material but not on its temperature; k is the Boltzmann constant (1.38 x 10^-23 J/K), and kT is approximately the mean thermal energy of an atom.

ii. Schottky emission. The thermionic emission of electrons can be increased by applying an electrostatic field to the cathode surface. This field lowers the height of the potential barrier that keeps electrons inside the cathode, a barrier lowering known as the Schottky effect. A Schottky source consists of a pointed crystal of tungsten welded to the end of a V-shaped tungsten filament. The tip is coated with zirconium oxide (ZrO) to provide a low work function (~2.8 eV) and needs to be heated to only about 1800 K to provide adequate electron emission. Because the tip is very sharp, electrons are emitted from a very small area, resulting in a relatively high current density (Je ~ 10^7 A/m^2) at the surface. Because the ZrO is easily poisoned by ambient gases, the Schottky source requires a vacuum substantially better than that of a LaB6 source.

iii. Field emission. If the electrostatic field at the tip of a cathode is increased sufficiently, the width (horizontal in Fig. 3-4) of the potential barrier becomes small enough to allow electrons to escape through the surface potential barrier by quantum-mechanical tunneling, a process known as field emission. The probability of electron tunneling becomes high when the barrier width w is comparable to the de Broglie wavelength of the electron. This wavelength is related to the electron momentum p by p = h/λ, where h = 6.63 x 10^-34 J s is the Planck constant. Because the barrier width is smallest for electrons at the top of the conduction band, they are the ones most likely to escape. Because thermal excitation is not required, a field-emission tip can operate at room temperature, and the process is sometimes called cold field emission. As there is no evaporation of tungsten during normal operation, the tip can last for many months or even years before replacement. It is heated (flashed) from time to time to remove adsorbed gases, which affect the work function and cause the emission current to be unstable. Even so, cold field emission requires ultra-high vacuum (UHV: pressure ~ 10^-8 Pa) to achieve stable operation, requiring an elaborate vacuum system and resulting in a substantially greater cost of the instrument.

B. Electromagnetic Lens System. The TEM may be required to produce a highly magnified (e.g., M = 10^5) image of a specimen on a fluorescent screen, of diameter typically 15 cm. To ensure that the screen image is not too dim, most of the electrons that pass through the specimen should fall within this diameter, which is equivalent to a diameter of (15 cm)/M = 1.5 µm at the specimen. For viewing larger areas of the specimen, however, the final-image magnification might need to be as low as 2000, requiring an illumination diameter of 75 µm at the specimen.
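As a rough numerical check on the figures in the preceding sections, the sketch below evaluates the Richardson law for a tungsten cathode at 2700 K, the de Broglie wavelength of an electron accelerated through 100 kV (including a relativistic correction that the essay does not discuss), and the illuminated specimen diameter implied by a 15 cm screen at M = 10^5. The tungsten work function of about 4.5 eV is an assumed typical value not stated in the text; this is an illustrative sketch, not part of the original essay.

import math

# Physical constants (SI units)
k  = 1.380649e-23      # Boltzmann constant, J/K
h  = 6.626e-34         # Planck constant, J s
e  = 1.602e-19         # elementary charge, C
m0 = 9.109e-31         # electron rest mass, kg
c  = 2.998e8           # speed of light, m/s

# Richardson law J = A * T^2 * exp(-phi / kT) for thermionic emission.
A   = 1.0e6            # Richardson constant, A m^-2 K^-2 (order of magnitude from the text)
T   = 2700.0           # tungsten filament temperature, K (from the text)
phi = 4.5 * e          # assumed tungsten work function (~4.5 eV), in joules
J = A * T**2 * math.exp(-phi / (k * T))
print(f"thermionic current density ~ {J:.1e} A/m^2")

# De Broglie wavelength lambda = h / p of a 100 keV electron, with the
# momentum p taken from the relativistic energy-momentum relation.
V = 100e3                                   # accelerating voltage, V
E = e * V                                   # kinetic energy, J
p = math.sqrt(2 * m0 * E * (1 + E / (2 * m0 * c**2)))
print(f"electron wavelength ~ {h / p * 1e12:.2f} pm")

# Illumination diameter at the specimen for a 15 cm screen at M = 1e5.
screen, M = 0.15, 1e5
print(f"illuminated diameter ~ {screen / M * 1e6:.1f} micrometres")

The last line reproduces the 1.5 µm figure quoted above, and the 100 kV wavelength comes out near 3.7 pm, far shorter than visible-light wavelengths, which is why the TEM can resolve so much finer detail than a light microscope.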
In order to achieve the required flexibility, the condenser-lens system must contain at least two electron lenses. The first condenser (C1) lens is a strong magnetic lens, with a focal length f that may be as small as 2 mm. Using the virtual electron source (diameter ds) as its object, C1 produces a real image of diameter d1. Because the lens is located 20 cm or more below the object, the object distance u ~ 20 cm >> f, and so the image distance v ~ f. The second condenser (C2) lens is a weak magnetic lens (f ~ several centimeters) that provides little or no magnification (M ~ 1) but allows the diameter of illumination (d) at the specimen to be varied continuously over a wide range. The C2 lens also contains the condenser aperture (the hole in the condenser diaphragm), whose diameter D can be changed in order to control the convergence semi-angle of the illumination, the maximum angle by which the incident electrons deviate from the optic axis. Figure: lens action within the accelerating field of an electron gun, between the electron source and the anode. Curvature of the equipotential surfaces around the hole in the Wehnelt electrode constitutes a converging electrostatic lens (equivalent to a convex lens in light optics), whereas the non-uniform field just above the aperture in the anode creates a diverging lens (the equivalent of a concave lens in light optics).

C. Sample Holder. To allow observation in different brands or models of microscope, TEM specimens are always made circular with a diameter of 3 mm. Perpendicular to this disk, the specimen must be thin enough (at least in some regions) to allow electrons to be transmitted to form the magnified image. The specimen stage is designed to hold the specimen as stationary as possible, as any drift or vibration would be magnified in the final image, impairing its spatial resolution (especially if the image is recorded by a camera over a period of several seconds). But in order to view all possible regions of the specimen, it must also be possible to move the specimen horizontally over a distance of up to 3 mm if necessary. The design of the stage must also allow the specimen to be inserted into the vacuum of the TEM column without introducing air. This is achieved by inserting the specimen through an airlock, a small chamber into which the specimen is placed initially and which can be evacuated before the specimen enters the TEM column. Not surprisingly, the specimen stage and airlock are the most mechanically complex and precision-machined parts of the TEM. There are two basic designs of specimen stage: side-entry and top-entry. In a side-entry stage, the specimen is clamped (for example, by a threaded ring) close to the end of a rod-shaped specimen holder and is inserted horizontally through the airlock. The airlock-evacuation valve and a high-vacuum valve (at the entrance to the TEM column) are activated by rotation of the specimen holder about its long axis; see figure (a). One advantage of this side-entry design is that it is easy to arrange for precision motion of the specimen. Translation in the horizontal plane (x and y directions) and in the vertical (z) direction is often achieved by applying the appropriate movement to an end-stop that makes contact with the pointed end of the specimen holder.
A further advantage of the side-entry stage is that heating of a specimen is easy to arrange, by installing a small heater at the end of the specimen holder, with electrical leads running along the inside of the holder to a power supply located outside the TEM. The ability to change the temperature of a specimen allows structural changes in a material (such as phase transitions) to be studied at the microscopic level. Specimen cooling can also be achieved, by incorporating (inside the side-entry holder) a heat-conducting metal rod whose outer end is immersed in liquid nitrogen (at 77 K). One disadvantage of the side-entry design is that mechanical vibration, picked up from the TEM column or from acoustical vibrations in the external air, is transmitted directly to the specimen. In addition, any thermal expansion of the specimen holder can cause drift of the specimen and of the TEM image. These problems have been largely overcome by careful design, including the choice of materials used to construct the specimen holder. As a result, side-entry holders are widely used, even for high-resolution imaging. In a top-entry stage, the specimen is clamped to the bottom end of a cylindrical holder that is equipped with a conical collar; see figure (b). The holder is loaded into position through an airlock by means of a sliding and tilting arm, which is then detached and retracted. Inside the TEM, the cone of the specimen holder fits snugly into a conical well of the specimen stage, which can be translated in the horizontal (x and y) directions by a precision gear mechanism. The major advantage of a top-entry design is that the loading arm is disengaged after the specimen is loaded, so the specimen holder is less liable to pick up vibrations from the TEM environment. In addition, its axially symmetric design tends to ensure that any thermal expansion occurs radially about the optic axis and therefore becomes small close to the axis. On the other hand, it is more difficult to provide tilting, heating, or cooling of the specimen. Although such facilities have all been implemented in top-entry stages, they require elaborate precision engineering, making the holder fragile and expensive. Because the specimen is held at the bottom of its holder, it is difficult to collect more than a small fraction of the x-rays that are generated by the transmitted beam and emitted in the upward direction, making this design less attractive for high-sensitivity elemental analysis.

D. Imaging System. The sample is placed in front of the objective lens in the form of a thin foil, thin section, or fine particles transparent to the electron beam (Figure 3). The objective lens forms an image of the electron-density distribution at the exit surface of the specimen, based on electron-optical principles. The diffraction, projection, and intermediate lenses below the objective lens are used to focus and magnify either the diffraction pattern or the image onto a fluorescent screen, which converts the electrons into a visible light signal. There are three important mechanisms that produce image contrast in the electron microscope: mass-thickness contrast, phase contrast, and diffraction or amplitude contrast.

i. Mass-thickness contrast arises from incoherent elastic scattering of electrons. As electrons pass through the specimen they are scattered off axis by elastic nuclear interaction, also called Rutherford scattering. The cross section for elastic scattering is a function of the atomic number (Z).
As the thickness of the specimen increases, the amount of elastic scattering also increases, since the mean free path remains fixed. Likewise, specimens consisting of higher-Z elements scatter more electrons than low-Z specimens. This creates differential intensity in the image: thicker or higher-atomic-number regions transmit fewer electrons to the image and appear darker, while thinner or lower-atomic-number regions appear brighter in the image plane. In TEM, the mass-thickness contrast is affected by the size of the objective aperture and the accelerating voltage. Smaller apertures increase the difference in the ratio of scattered to transmitted electrons and consequently increase the contrast between regions of different mass or thickness. Lowering the accelerating voltage has a similar effect, since the scattering angle and the cross section increase, which also raises the relative contrast between higher-mass and lower-mass regions.

ii. Phase contrast. Some of the electrons leaving the specimen are recombined to form the image, so that phase differences present at the exit surface of the specimen are converted into intensity differences in the image. Phase contrast is the dominant mechanism for object detail finer than about 15 Å.

iii. Diffraction contrast. Diffracted electrons leaving the lower surface of a crystalline specimen are intercepted by the objective aperture and prevented from contributing to the image; alternatively, only one diffracted beam forms the image. Diffraction contrast is the dominant mechanism delineating object detail >15 Å in crystalline specimens and is an important and widely used contrast mechanism for the study of crystal defects. Using this approach, considerable quantitative information about the defect structure of the specimen may be obtained without operating the microscope at maximum resolution.

Vacuum System. Electron microscopes cannot operate in air for a number of reasons. The penetration of electrons through air is typically no more than one meter, so after traveling about a meter from the gun, the whole beam would be lost to collisions of the electrons with air molecules. It is also not possible to generate the high charge difference between the anode and cathode in the gun, because air is not a perfect insulator. Finally, a beam striking a specimen in air would trap all sorts of rubbish (air is full of hydrocarbon molecules) on the specimen, crack it (removing hydrogen, oxygen, etc.), and thus leave a thick carbon-contamination layer on the specimen. Each electron microscope therefore has a vacuum system. The degree of sophistication of the vacuum system depends on the requirements. Simple imaging of biological thin sections is much less demanding than cryo applications or small-probe analysis in materials science, and a thermionic gun can operate under a much worse vacuum than a Field Emission Gun (FEG). The most basic vacuum system consists of a vessel connected to a pump that removes the air. The vacuum system of an electron microscope is considerably more complicated, containing a number of vessels, pumps, valves (to separate different vessels), and gauges (to measure vacuum pressures). From the bottom up we can distinguish four vessels in the vacuum system: the buffer tank, the projection chamber, the column (specimen area), and the electron gun area. Sometimes a turbomolecular pump (TMP), essentially a high-speed turbine fan, is used in place of (or to supplement) a diffusion pump.
Usually an ion pump is used to achieve pressures below 10^-4 Pa, as required to operate a LaB6, Schottky, or field-emission electron source. By applying a potential difference of several kilovolts between large electrodes, a low-pressure discharge is set up (aided by the presence of a magnetic field) which removes gas molecules by burying them in one of the electrodes. Figure: cross section through a diffusion pump. The arrows show oil vapor leaving jets within the baffle assembly; water flowing within a coiled metal tube keeps the walls cool. Frequently, liquid nitrogen is used to help achieve an adequate vacuum inside the TEM, through a process known as cryo-pumping.
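As a toy illustration of the mass-thickness contrast mechanism described in the imaging section above, the short sketch below models the transmitted fraction of the beam with a simple exponential attenuation law, I/I0 = exp(-t/λ), where the mean free path λ is shorter for higher-Z material. The specific thicknesses and mean-free-path values are invented for illustration and do not come from the essay.

import math

def transmitted_fraction(thickness_nm, mean_free_path_nm):
    """Toy mass-thickness contrast model: fraction of beam electrons reaching
    the image without being scattered outside the objective aperture."""
    return math.exp(-thickness_nm / mean_free_path_nm)

# Illustrative (made-up) elastic mean free paths at ~100 kV: higher-Z material
# scatters more strongly, so its mean free path is shorter.
regions = {
    "thin carbon film (50 nm)":   (50, 150),   # (thickness, mean free path) in nm
    "thick carbon film (150 nm)": (150, 150),
    "gold particle (50 nm)":      (50, 20),
}

for name, (t, mfp) in regions.items():
    print(f"{name}: transmitted fraction ~ {transmitted_fraction(t, mfp):.2f}")
# Thicker or higher-Z regions transmit fewer electrons and therefore appear darker.

Even this crude model reproduces the qualitative rule stated above: thicker and higher-atomic-number regions of the specimen come out darker in the final image.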
Wednesday, October 2, 2019
The Benefits of Prolonging and Separating Vaccines Essay examples -- V
Parents today have many concerns for the well-being of their child. One big apprehension is what vaccines are being introduced into their infant's small body and the many adverse reactions they cause. In our current generation, infants are injected with up to 31 vaccines just in their first year of life (CDC, 2015). Life-threatening diseases are prevented with such vaccines, but parents are often left to wonder how many of these vaccines are even necessary. Many of the vaccines are given in combinations; sometimes three or more disease-fighting vaccines are given in one inoculation. There is continued clinical research to increase the efficiency of these vaccines, changing their components and making them vastly different from what they were in generations past. Separating these vaccines can make a large difference in how a child's body reacts to the new generation of vaccines. Vaccines can not only be separated out of the combinations that are most common, they can also be prolonged. Some parents choose to give vaccines only every six months; others choose to wait to start any vaccinations until their child is two years of age (Miller, 2014). Parents have choices today: follow the recommended dosage schedule, or prolong and separate their child's vaccines. In doing the latter, an infant's body systems have time to mature, side effects may be reduced, and parents will be more willing to vaccinate.

Prolonging and Separating Infant Vaccinations. There are many reasons parents choose to vaccinate or not. Side effects and fears of permanent adverse reactions are among the biggest parental fears when considering when and how to vaccinate a child. With the emergence of fears of autism, neurological problems, develop... ...ines/multiplevaccines.html

Fisher, Barbara. (2011). Vaccine Safety: Evaluating the Science. Medical Science and Public Trust: The Policy, Ethics and Law of Vaccination in the 20th and 21st Century. Retrieved from http://www.nvic.org/getdoc/6cd24653-fd19-49e5-842a-3917e15de533/Medical-Science---Public-Trust.aspx#top
Miller, Donald. (2014). A User-Friendly Vaccination Schedule. LewRockwell.com. Retrieved from http://www.lewrockwell.com/miller/miller15.html
Morales, Tatiana. (2014). To Vaccinate or Not. CBSNews. Retrieved from http://www.cbsnews.com/stories/2014/12/04/earlyshow/contributors/emilysenay/main531638.shtml
Stratton, Kathleen, Wilson, Christopher & McCormick, Marie. (2002). Under Review: Multiple Immunizations and Immune Dysfunction. Immunization Safety Review (pp. 32-42). Retrieved from http://www.nap.edu/openbook.php?record_id=10306&page=32
Tuesday, October 1, 2019
Ma Rainey and Bessie Smith: Two Legendary Classical Blues Artists Essay
The blues emerged as a distinct African-American musical form in the early twentieth century. It typically employed a twelve-bar framework and three-line stanzas; its roots lie in early African-American songs, such as field hollers and work songs, and it generally has a melancholy mood. The blues can be divided into many sub-genres, including Classical, Country, and Urban. The purpose of this paper is to focus on the careers of two of Classical blues' most influential and legendary singers: Ma Rainey and Bessie Smith. Ma Rainey, considered by many to be the "Mother of the Blues," was one of the first pioneers of the Classical blues style. She sang with a deep, rich, and often rough contralto voice, while the voices of her contemporaries a generation later were more harmonious. Rainey was an important figure in connecting the Classical blues, largely female dominated, with the predominantly male Country blues. Born Gertrude Pridgett in Georgia in 1886 to parents who had both performed in minstrel shows, she was exposed to music at a very early age. At the age of fourteen, she performed in a local talent show called "The Bunch of Blackberries," and by 1900 she was regularly singing in public. Over the next couple of decades, she worked in a variety of traveling minstrel shows, including Tolliver's Circus and Musical Extravaganza and the Rabbit Foot Minstrels; she was one of the first women to incorporate the blues into minstrelsy. It was while working with the Rabbit Foot Minstrels that she met William Rainey, whom she married in 1904; together, they toured as "Ma and Pa Rainey: Assassinators of the Blues." By the early 1920s, she was a star of the Theater Owners' Booking Agency (TOBA), which were white-... ...line of Smith's career, and in Classical blues in general, was due to changing trends in music. Classical blues was out, and Swing was now the music of choice. Smith, however, was determined to make a comeback. She began performing again, this time labeling herself as a Swing singer. But before she could re-establish herself as a household name, she passed away from injuries caused by an automobile accident. It was not until some years after her death that her music began to be popularized again. Her recordings with Armstrong became popular among jazz musicians and had great influence on singers such as Billie Holiday, who often listened to Smith's records for inspiration. Frank Sinatra held her in high esteem, and Janis Joplin often emulated Smith's voice in her singing. Bessie Smith was inducted into the Rock and Roll Hall of Fame in 1989.
The General Understanding of Technology
ââ¬Å"Technology is not an image of the world but a way of operating on reality. The nihilism of technology lies not only in the fact that it is the most perfect expression of the will to power â⬠¦ but also in the fact that it lacks meaning. â⬠(Octavio Paz) Technology is the general term for the processes by which human beings create tools and machines to increase their control and understanding of the material environment. It is perhaps best understood in a historical context that traces the evolution of early humans from a period of very simple tools to the complex, large-scale networks that influence much of our modern-day life. For the past couple of decades, it has been unclear, whether technology is a positive movement or a path to self-destruction. The debate has led strong arguments from both sides, but the one thing that they both agree on is that technology involves a huge risk. However, the movement toward a technological workplace has been undoubtedly in the works for a long time and no matter what the critics say it will still continue to grow exponentially each year. As the world stumbles toward the twenty-first century, a shadow looms over the planet, a dawn of a new revolution: a revolution of work. Just as human history was forced to cope with the transformations that came with the rise of the Industrial Revolution, we now must deal with the end of that Revolution and the beginning of another. Although this technological revolution in the business world has been the subject of immense media hype and scrutiny in the past few years, it has occurred slowly but surely over the past few decades. The revolution reaches as far back as the invention of the telegraph in the 1850s. The invention of the telephone, fax machine, and more recent developments in wireless communication have offered businesses more flexibility and efficiency, and those willing to embrace these new technologies have found that they are more likely to survive and prosper than fade away as fads. As a result, employers persistently push for technological advancements regardless of the risks. Rumors about computers taking over peopleâ⬠s jobs run rampant through todayâ⬠s high-speed network of communication. The fear of losing oneâ⬠s job to a hard-cased metallic box is beyond anyoneâ⬠s understanding. However strong of a possibility it may be, the technology age is far from it. As Nobel Peace Laureate Arno Penzias, chief scientist at Lucent Bell Labs, said ââ¬Å"â⬠¦ I can't say anything is totally impossibleââ¬âof a computer, no matter how powerful, replacing a human being. Human beings just do too many different things. â⬠Technology still requires human interaction. For example, at a super-market, if the clerk scans a product over the bar code reader and the reader is unable to read the product correctly, the clerk must manually enter the number into the register. Arno goes on to reiterate that ââ¬Å"Technology is a tool and it can make us whatever we are already, only more so. Todayâ⬠s technology is in no state to replace humans, but rather is in a state requiring integration of human intuition and machine logic. The result is today's heavily technological workplace, where proficiency with complex phone systems, fax machines, and networked computers is essential. These machines tend not only to liberate but also enslave the common worker. 
Critics argue that technology can be a positive influence, but in the current situation, in which new technology appears each day, it is making more of a negative impact and generating additional hardship for the worker. A report by the Information Technology Association of America (ITAA) warns that one out of every ten jobs requiring information technology skills is going unfilled due to a shortage of qualified workers. Critics claim that workers are unable to keep up with the speed at which technology is being unveiled and that employers are blinded by the "infinite possibilities" that technology promises. "It's like running out of iron ore in the middle of the Industrial Revolution," says the association's (ITAA) president. A study estimates that 60% of new jobs in the year 2000 will require skills possessed by only 22% of new workers, thus forcing U.S. companies to send more of their work overseas, where they can find eligible job candidates. Technology is a positive movement; however, it plays a key role in many cases of unemployment. As the rate of technological development quickens, those who do not work with these advancements on a day-to-day basis can become detached from modern industry and consumer demands, and thus far less useful to a company. For example, a young employee at a bank in the past could become increasingly useful and valuable to his company as he aged, since his knowledge would be cumulative of all that he had experienced and the industry would probably not undergo drastic changes in fifty years. Today, however, a 50-year-old manager of a computer firm would have started his career when punch cards were used to collect and store data in programs. For him to keep up with the astounding changes in the computer industry over the past 30 years would be a commendable achievement by itself, let alone while running a company at the same time. However, despite the prosperity that technology may bring, the current trend of hardships in a technological workplace has deterred many young workers. An ITAA survey of 2,000 large and mid-sized companies found at least 190,000 unfilled information technology (IT) jobs. The report cited a decline in college graduates with degrees in mathematics or computer science. Currently, "With the median age at 40 and climbing, middle-aged and older workers will be the core of tomorrow's workforce (while younger workers will be scarce)... To compete for the best workers, businesses will offer expanded employee benefits and flex scheduling to accommodate the needs of diverse ages and lifestyles." The benefits that businesses promise their workers are beginning to appeal more and more to younger workers and college graduates. According to a study by Newsweek, traces of technological growth are already evident: the top three fastest-growing and top-paying jobs involve or directly use technology, with Database manager at 11. %, Computer engineer at 10.9%, and Systems Analyst at 10.3%. The introduction of technology into the workplace sometimes poses difficult challenges for supervisors, and often for the manager-employee relationship. Although a worker's access to a phone or computer may theoretically increase his or her productivity, it also introduces new temptations for distraction and wasted time. In addition, employees become more isolated and their relationships with co-workers deteriorate.
Client contacts can frequently be handled over the phone or by other electronic means, and although this usually proves more efficient and cost-effective than traditional person-to-person contact, it also results in a depersonalization of the relationship. Technological advancements also sometimes lead to divisions within a company between management and its employees. Management must decide whether to give workers the freedom associated with many of these technologies and must construct a plan for monitoring employees' use of them, while keeping in mind that overbearing supervision leads to worker dissatisfaction and distrust of managers. In general, relationships between individuals at any level of a company tend to suffer with the introduction of new technological methods. In summary, technology has changed our workplaces enormously. It has not only opened up opportunities but has also changed the very nature of work. In the transformation from an agricultural to an industrial economy, the world redefined work: labor meant the men, women, and children in factories. However, those jobs are no longer there. With the advent of more modern, mechanized production facilities, the majority of people are no longer needed for the production of goods. These trends foreshadow not just change but a seismic quake, a wave of change that will crash upon us with a force we haven't known before. Many will see this new wave of change as frightening, but it does not have to be viewed that way. Aside from all the loss and danger our collective future shows, it also offers unparalleled opportunity.