When we talk about air stripping, we are referring to the process of removing VOCs, or volatile organic chemicals, from wastewater using a form of aeration treatment. Volatile organic chemicals are dangerous man-made chemicals that are released from solids and liquids in the form of a gas.
They include chlorinated solvents and chemicals normally found in fuel and other petroleum products. They are also found in paint thinners, hydraulic fluids and dry-cleaning agents. Contamination of drinking water with volatile organic chemicals is a major concern because many of them are carcinogens.
A lot of volatile organic chemicals can be found in industrial wastewater and groundwater supplies, and to some extent even in surface water. Contact with any of these chemicals can result in central nervous system (CNS), skin, liver and kidney problems.
Using an air stripper, these volatile organic chemicals can be effectively removed from both wastewater and groundwater sources. Here is how it works: a liquid, usually wastewater, is brought into contact with a gas, in this case air, in order to remove the undesirable agent from the liquid phase so that the water can be safely disposed of.
How effective is Air Stripping?
This is determined largely by the air-to-water ratio: treatment effectiveness increases as the air-to-water ratio increases, which is why stripping towers are designed with large air capacity. The kind of contaminant that requires removal and its concentration in the wastewater are also major factors. Other variables, such as water temperature and the presence of other contaminants like dissolved solids, also play an important role. Once the chemicals have evaporated, they are removed safely; some simply dissipate when exposed to sunlight, while other gases may require further treatment before they can be safely disposed of.
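The effect of the air-to-water ratio can be sketched with a simple mass balance. The snippet below is a minimal illustration, not a tower design tool: it assumes a single ideal equilibrium stage with clean inlet air, so removal depends only on the stripping factor S = H × (air flow / water flow), where H is the contaminant's dimensionless Henry's constant. The example value H ≈ 0.4 is roughly in the range reported for trichloroethylene at about 20 °C.

```python
def single_stage_removal(henry_dimensionless, air_to_water_ratio):
    """Fraction of a VOC removed in one ideal equilibrium stage.

    Assumes clean inlet air and equilibrium between the exiting air
    and water (dimensionless Henry's constant H = C_gas / C_water).
    A mass balance then gives removal = S / (1 + S), where the
    stripping factor S = H * (Q_air / Q_water).
    """
    s = henry_dimensionless * air_to_water_ratio
    return s / (1.0 + s)

# Illustrative values only (H ~ 0.4 is in the range cited for TCE):
for ratio in (10, 30, 100):
    print(ratio, round(single_stage_removal(0.4, ratio), 3))
```

Even this idealized model shows why large air capacity matters: raising the air-to-water ratio from 10 to 100 pushes removal from about 80% to about 98%. A real packed tower behaves like many such stages in series, so actual performance also depends on packing height and mass-transfer rates.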
Perfluorinated compounds are man-made chemicals built around a chain of eight carbons bonded to fluorine. They function mainly to make things less sticky. They are a primary ingredient in a number of products such as stain-resistant carpets, furniture and fabrics. They are also found on the inside of food packaging and are among the main chemicals used in non-stick pans.
According to granular activated carbon filter experts, the fact that these chemicals are man-made means they only become a problem if you are in or near an area where they were manufactured or used, such as a wastewater treatment facility, furniture factory, textile factory and the like: any place where they have seeped into the ground or may be present in the water supply.
Unfortunately, the EPA has not yet established an official maximum contaminant level for these compounds under the Safe Drinking Water Act. It has, however, put out a health advisory limit of 70 parts per trillion. These compounds have been voluntarily phased out of production since 2002, but some use remains where there is no acceptable alternative.
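To put that 70 parts per trillion figure in perspective, in water (density about 1 kg/L) a part per trillion by mass works out to one nanogram per liter. A quick sketch, with the function names and sample concentrations invented for illustration:

```python
def ppt_to_ng_per_l(ppt):
    """Convert parts per trillion (mass basis, in water) to ng/L.

    1 L of water is ~1 kg = 1e12 ng, so 1 ppt = 1 ng per liter,
    assuming water density of 1 kg/L.
    """
    return ppt * 1.0

def exceeds_advisory(pfoa_ng_l, pfos_ng_l, limit_ng_l=70.0):
    """The 2016 EPA health advisory applies to PFOA and PFOS combined."""
    return (pfoa_ng_l + pfos_ng_l) > limit_ng_l

print(ppt_to_ng_per_l(70))        # the advisory limit in ng/L
print(exceeds_advisory(40, 40))   # combined 80 ng/L is over the limit
```

The one-to-one conversion is why lab reports for these compounds are usually given directly in ng/L.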
There are three accepted treatment methods for removing perfluorinated compounds: reverse osmosis, ion exchange resin and granular activated carbon.
Reverse osmosis, which is also commonly used to remove volatile organic chemicals from water, generates two water streams: a permeate stream, which is the good water, and a concentrate stream, which is the wastewater.
Granular activated carbon adsorbs the chemicals onto the carbon and keeps them there. Basically, water is run through the carbon bed and the chemicals stay behind.
Ion exchange resin relies on the same adsorption process.
Reverse osmosis involves substantial electricity costs, as you will need a pump to provide the necessary pressure for the system, along with tanks and re-pressurization pumps for the permeate. Many municipal water treatment facilities use granular activated carbon treatment, which works well as long as there is enough empty bed contact time; this can be maintained by changing out spent activated carbon or by adding vessels. Since it costs less than reverse osmosis, many municipalities use it.
In any industry, water is a vital part of the system. From energy production to food and beverage, it plays a crucial role. However, if you are not familiar with the ins and outs of water treatment, you will need the services of a water service provider. Here are some tips to help you choose the right one.
Safety and Environmental Compliance
According to environmental remediation equipment providers, safety and environmental compliance are two of the biggest factors that concern customers. Companies need assurance that the people who come in to work will be able to leave in the same condition at the end of the day. Moreover, it is a no-brainer that companies require providers to be conscientious of the environment. Another aspect customers look for is value: innovative ideas and the right solutions at good pricing over the lifespan of the water treatment system.
Companies should look for providers with an impeccable safety record, and that record should be visible in the work culture as well. They should expect any water treatment service provider they invite to their facility to adhere to the rules and regulations that govern their business. How can one know if a company has a strong safety culture? Look at the metrics it uses for its workers and projects; these will show how seriously it takes rules and safety.
If you are going to handle water decontamination such as VOC removal, you need the expertise to do it correctly and up to industry standards. Look for a provider with the highest levels of competency, communication and dedication to its clients. This comes in the form of knowledgeable sales reps supported by deep application engineering, competent project management and manufacturing knowledge. Customers should expect the provider to have a handle on the customer's business and, more importantly, on the kind of water system that business requires.
In many on-site wastewater treatment systems, UV disinfection has been accepted as a standard feature. Many in the drinking water industry have also accepted it as an effective means of combating chlorine-resistant organisms like Giardia and Cryptosporidium. It is also popular in wastewater treatment because it uses no chemicals, negates the need for dechlorination and disinfects everything it comes into contact with.
How It Works
According to activated carbon adsorption experts, UV light disinfects by causing permanent damage to the DNA of microorganisms. Once the DNA is destroyed, the organism can no longer function and dies. How can this effectiveness be measured? UV system validation using bioassay methods provides excellent evidence of effectiveness.
To perform a bioassay, challenge organisms are introduced into the fluid stream upstream of the UV system. The operation is carried out under tightly controlled conditions, and system variables such as flow, transmittance, power load and lamp intensity are measured before and after the procedure.
Once all the data has been collected, it is sent to the analyzing laboratory to be compared against the manufacturer's estimates. It goes without saying that these bioassays need to be carried out under the supervision of a credible third party.
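The quantity a bioassay ultimately verifies is the delivered UV dose, which at its simplest is average intensity multiplied by exposure time (mW/cm² × s = mJ/cm²). The sketch below uses that simple product plus a first-order (linear log-inactivation) model; the numbers are hypothetical, and real validated systems use more sophisticated dose-monitoring relationships.

```python
def uv_dose_mj_cm2(avg_intensity_mw_cm2, exposure_s):
    """UV dose (fluence) = average intensity x exposure time.

    mW/cm2 multiplied by seconds yields mJ/cm2.
    """
    return avg_intensity_mw_cm2 * exposure_s

def log_inactivation(dose_mj_cm2, d10_mj_cm2):
    """Predicted log10 reduction assuming first-order kinetics,
    where d10 is the dose that achieves a 1-log (90%) reduction
    for the target organism."""
    return dose_mj_cm2 / d10_mj_cm2

# Hypothetical example: 10 mW/cm2 average intensity over a 4 s
# residence time, against an organism with a 1-log dose of 8 mJ/cm2.
dose = uv_dose_mj_cm2(10, 4)
print(dose, log_inactivation(dose, 8))
```

This also makes clear why the variables measured in the bioassay matter: flow sets the exposure time, and transmittance and lamp output set the average intensity, so any of them can move the delivered dose.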
Virtually all ultraviolet systems have to deal with differences in flow rates, so an operating flow range is normally specified when designing an ultraviolet system. It is therefore vital to install a UV system for wastewater disinfection in a closed-pipe setup, not only to ensure optimal hydraulics but, more importantly, to prevent anyone operating the system from being exposed to the UV light and the wastewater.
Chlorination is no longer the only solution for disinfecting wastewater. UV systems are now at the forefront of wastewater treatment, not only for their effectiveness but, more importantly, for their non-chemical nature.
Cofferdam and dewatering projects pose a number of challenges during the construction process. According to construction dewatering experts, miscalculating what lies below the surface, such as depth and flow, groundwater infiltration and degree of turbidity, can result in costly operating expenses. Here are some important tips to consider.
Depth and Flow
According to groundwater treatment system experts, the depth and flow conditions of rivers fluctuate with the seasons, so it is vital to review the timing of the project. For example, water levels in the Northeast are usually lower in August than in April, after heavy rains or snowmelt. It is always a good idea to check USGS gauges to monitor differences in river levels and flow speeds, and so determine the ideal timetable for a project.
It is vital for any contractor to know the limits of the permits they are going to get, as these will have a direct effect on the dewatering project. Permits covering turbidity, contamination, right of way and time constraints for working in the water are usually the responsibility of the project engineer, but it does not hurt for the contractor to be aware of them as well.
Navigating the Subsurface
It is vital that contractors are aware of silt levels, vegetation, slopes and the composition of the subsurface, because all of these affect which cofferdam method will work best for the project. For instance, bedrock is not a good area to drive sheet piling, while heavy vegetation can establish waterways or channels beneath the surface that could leak into the dry work area. A good knowledge of the subsurface terrain and elevations plays a crucial role in choosing the best cofferdam method.
Engineering and Design Criteria
The contractor must know which dewatering solution or technology is best for a given project and whether it is also ideal from a cost and viability point of view. The contractor should also have intimate knowledge of dam safety protocols and other procedures in order to provide adequate engineering support in an emergency.
When we talk about nanotechnology, we are referring to the ability to manipulate matter at the atomic and molecular level in order to create something useful at the nano-dimensional scale. There are two approaches: top-down and bottom-up. The former uses the same methods as MEMS fabrication but scaled down in size, usually with the help of advanced photolithography and etching techniques. The latter relies on deposition, growth and self-assembly technologies.
According to precision photolithography specialists, nanotechnology could eventually allow engineers to place each atom or molecule in the desired location and position during assembly, making it possible to build almost any structure or material permitted by the laws of physics at the molecular and atomic level.
While MEMS and nanotechnology are nominally separate and distinct technologies, the distinction between them is not clearly defined. As a matter of fact, they rely and depend on one another to work. A good example is the tunneling tip of a scanning tunneling microscope, used to detect individual atoms and molecules at the nanometer scale, which is a MEMS device. Likewise, the atomic force microscope, used to manipulate the position and placement of individual atoms on a surface, is also considered a MEMS device.
Also, a lot of MEMS technologies are becoming dependent on nanotechnologies in order to create new products. For instance, crash airbag accelerometers made with MEMS technology can have their long-term reliability degraded by stiction between the proof mass and the substrate. Self-assembled monolayer coatings produced with nanotechnology are used to treat the surfaces of moving MEMS elements in order to prevent stiction from hampering product quality over the long term.
Are nanotechnology and MEMS one and the same? There is definitely a synergy between the two technologies, and the benefits they provide overshadow any negative impact (if any) one may have on the other. The fact that using the two together allows us to create new materials at miniature dimensional scales frees us from the limits of space and perhaps even time itself.
MEMS stands for micro-electro-mechanical systems, which in their simplest form may be defined as miniaturized mechanical and electro-mechanical systems created using methods of microfabrication. The dimensions of MEMS devices range from well below one micron on the lower end of the dimensional spectrum all the way to several millimeters. Moreover, MEMS devices range from simple structures with no moving elements to highly sophisticated electromechanical systems featuring multiple moving elements under the control of integrated microelectronics.
According to PARCAM with EXT software specialists, the functional elements of MEMS are microstructures, microsensors, microactuators and microelectronics; the most important of these are perhaps the microsensors and microactuators. These are defined as devices that convert energy from one form to another. For instance, microsensors convert a measured mechanical signal into an electrical one.
Over several decades of MEMS research, an extensive number of microsensors has been developed for virtually every possible modality, including temperature, pressure and inertial forces. Micromachined sensors often exceed the performance of their macroscale counterparts; for instance, a micromachined pressure transducer can outperform a conventionally machined pressure sensor. Not only does their performance better that of their counterparts, but the batch fabrication methods developed for them translate into a low production cost per device.
There are also a number of exceptionally performing microactuators on the market today, such as microvalves for the control of gas and liquid flows, micropumps that establish positive fluid pressures and controlled micromirror arrays for displays. Even though these devices are tiny, they can have macroscale effects; for instance, small microactuators installed on the leading edges of aircraft airfoils have been able to steer the aircraft using only these microminiaturized devices.
The potential of MEMS technology can be fully appreciated when miniaturized sensors, actuators and structures are all combined on a single common silicon substrate along with an integrated circuit. It would really be cool to witness the micromachining process that selectively etches away parts of the silicon wafer, or forms new structural layers, to create mechanical and electromechanical devices.
Photomasks play a crucial role in the process of microlithography, as they are used in the creation of integrated circuits (ICs), photonic devices and micro-electro-mechanical systems (MEMS). Photomasks are composed of a fused silica or glass substrate coated with an opaque film into which a precise replication of the device designer's pattern is etched.
How is it made?
Basically, a photomask is created by writing the designer's pattern onto a resist-coated chrome mask blank. The latent image is then developed to form the needed pattern. The resist acts as a mask during the etching process: the pattern is transferred into the chrome film, and the resist layer is then removed. Lastly, if required, a protective pellicle is attached and the manufacturing process is complete.
Types of Photomasks
The first type of photomask is used for hard contact printing to transfer the design to the substrate. However, this photomask is prone to deterioration from mechanical damage. If the feature size and specifications allow, the solution is to use a copy photomask created from a “master,” which is then retained in case more copies are needed. The copy mask is usually made on a soda-lime glass substrate.
When close-proximity printing or projection aligners are used to transfer the design to the substrate, there is little damage to the photomask. Like hard contact printing, this type uses broadband or near-UV light to expose the wafer. While remaining at the same scale factor (1x) as the final device, higher pattern fidelity and tighter specs can be achieved.
Photomasks used with an optical projection stepper or scanner at a reduction ratio of 2.5:1, 4:1 or 5:1 are called reticles. They use single wavelengths from i-line (365nm) to deep-UV (248nm). Reticles can support the strictest lithography requirements; in some advanced fabs, the imaged features are smaller than the wavelength of the exposure light.
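The reduction ratio is what makes reticles easier to manufacture than 1x masks: the pattern on the reticle is drawn several times larger than the geometry that prints on the wafer. A minimal sketch of that arithmetic, with the example feature sizes chosen for illustration:

```python
def wafer_feature_nm(mask_feature_nm, reduction_ratio):
    """Printed feature size for a reduction stepper/scanner.

    The reticle pattern is reduction_ratio times larger than the
    final on-wafer geometry, so the projected feature shrinks by
    that factor.
    """
    return mask_feature_nm / reduction_ratio

# A 360 nm line drawn on a 4:1 reticle prints as a 90 nm line:
print(wafer_feature_nm(360, 4))
```

The same relationship also relaxes the reticle's defect and placement tolerances, since any error on the mask is shrunk by the reduction ratio when projected onto the wafer.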
3D printing is all the rage, with makers producing 3D-printed homes, devices and virtually anything they can get their hands on. However, the problem with 3D printing is that printed parts may still have quality problems, especially inconsistent mechanical properties. According to CAD phototooling software experts, a new technology has just reached the market that has the potential to improve the quality of 3D-printed products.
The new technology making waves in the 3D world is called CLIP, which stands for “continuous liquid interface production.” It is a photochemical process that pulls a complete solid product from a pool of liquid material, with mechanical properties, resolution and surface finishes similar to those of injection-molded parts.
According to phototooling design software specialists, CLIP is a variant of the stereolithography process, using light and oxygen to quickly produce objects from a pool of resin; essentially, it grows solid structures out of a liquid bath. Researchers involved with the technology have demonstrated continuous generation of monolithic polymeric parts up to ten centimeters in size with feature resolution below 100 micrometers.
The heart of the process is the creation of an oxygen-containing “dead zone” between the solid part and the liquid precursor, where solidification does not happen. This dead zone is only a few tens of micrometers thick. A continuous sequence of UV images is projected by a digital light-processing imaging unit in a precise pattern, as commanded by the 3D file of the object in question.
What it can do
As a result, engineers can begin to redesign parts from the ground up without being limited by the design rules usually associated with traditional manufacturing technologies. They can make parts lighter by using internal mesh structures, and single-assembly parts that address sealing requirements also reduce the overall complexity of product assembly. The result is a huge reduction in part and product failure, as the new process gives engineers more freedom to do what they want.
Known for being smooth, glass panes are found on the front of virtually every smartphone and tablet on the market today. However, that smoothness can also be the glass's undoing, especially for users trying to manipulate virtual buttons and knobs. According to IGI plotter experts, researchers from Northwestern University think they have found a way to offer tactile feedback on a sheet of glass, so that a computer screen can not only show what something looks like but also convey a sense of touch.
About 10 years ago, researchers in Northwestern's mechanical engineering department developed a device that can render virtual shapes and textures, e.g. buttons, on a touchscreen. They named this device TPaD; it applies ultrasonic waves to a thin glass plate placed over the screen, decreasing the friction between the fingertip and the plate. The result? The glass feels much more slippery.
The researchers needed to understand how the effect works before they could develop practical applications. According to sculpted patterning experts, there were two competing hypotheses. The first held that the vibrating plate compresses the thin film of air beneath the fingertip, creating the pressure needed to levitate the skin of the finger off the screen. The second held that the ultrasonic vibrations cause the skin itself to bounce off the surface of the glass.
The Northwestern researchers built a prototype test device, essentially a fingerprint imager connected to the TPaD, and found that both theories were partly true. The reduced friction, or slipperiness, of the glass was caused by the skin bouncing not on the glass plate itself but on a layer of air trapped between the plate and the surface of the finger. To put it simply, the fingertip is actually bouncing on air.
The researchers now plan to develop new algorithms that render more accurate textures and lower the power consumption of new kinds of haptic devices. They also want to develop technology to align the fingers of the visually impaired over a keyboard, giving them more control over what they type on screen.