Reichert AR360 Auto Refractor: In-Depth Technical Analysis and Operation Guide

I. Product Overview and Technical Background

The Reichert AR360 Auto Refractor, developed by Reichert Ophthalmic Instruments (a subsidiary of Leica Microsystems), is an automated electronic refraction device representative of early-21st-century advances in automated optometry. It pairs image-processing technology with an automatic alignment system, replacing the traditional refraction workflow of manually adjusting control rods and chin rests.

The core technological advantage of the AR360 lies in its “hands-free” automatic alignment system. When a patient focuses on a fixed target and rests their forehead against the forehead support, the device automatically identifies the eye position and aligns with the corneal vertex. This breakthrough design not only enhances measurement efficiency (with a single measurement taking only a few seconds) but also significantly improves patient comfort, making it particularly suitable for children, the elderly, and patients with special needs.

As a professional-grade ophthalmic diagnostic device, the AR360 offers a comprehensive measurement range:

  • Sphere: -18.00D to +18.00D (adjustable step sizes of 0.01D/0.12D/0.25D)
  • Cylinder: 0 to 10.00D
  • Axis: 0-180 degrees
This range covers the full spectrum of refractive error detection, from mild to severe cases.

II. Device Composition and Functional Module Analysis

2.1 Hardware System Architecture

The AR360 features a modular design with the following core components:

Optical Measurement System:

  • Optical path comprising an infrared light source and imaging sensor
  • Built-in self-calibration program (automatically executed upon power-on and after each measurement)
  • Patient observation window with a diameter of 45mm, featuring a built-in green fixation target

Mechanical Positioning System:

  • Translating headrest assembly (integrated L/R detector)
  • Automatic alignment mechanism (accuracy ±0.1mm)
  • Transport locking device (protects internal precision components)

Electronic Control System:

  • Main control board (with ESD electrostatic protection circuitry)
  • PC card upgrade slot (supports remote software updates)
  • RS-232C communication interface (adjustable baud rate from 2400 to 19200)

Human-Machine Interface:

  • 5.6-inch LCD operation screen (adjustable contrast)
  • 6-key membrane control panel
  • Thermal printer (printing speed of 2 lines per second)
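
The RS-232C interface listed under the electronic control system above allows measurements to be logged on a PC. The sketch below shows port settings consistent with the stated baud-rate range and a parser for a hypothetical "sphere cylinder axis" record; the AR360's actual output framing is documented in its interface manual and may differ.

```python
# Sketch: logging AR360 readings over RS-232C. The record format parsed
# here ("SPH CYL AXIS") is a HYPOTHETICAL example, not the documented
# AR360 protocol -- consult the interface manual for the real framing.

def parse_measurement(line: str) -> dict:
    """Parse one 'sph cyl axis' record, e.g. '-2.25 -1.50 10'."""
    sph, cyl, axis = line.split()
    return {"sph": float(sph), "cyl": float(cyl), "axis": int(axis)}

# Port settings consistent with the interface options above (pyserial-style):
PORT_SETTINGS = {
    "baudrate": 9600,   # adjustable 2400-19200 per the spec
    "bytesize": 8,
    "parity": "N",
    "stopbits": 1,
}

print(parse_measurement("-2.25 -1.50 10"))
# {'sph': -2.25, 'cyl': -1.5, 'axis': 10}
```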

2.2 Innovative Functional Features

Compared to contemporary competitors, the AR360 boasts several technological innovations:

  • Smart Measurement Modes: Supports single measurement, 3-average, and 5-average modes to effectively reduce random errors.
  • Vertex Distance Compensation: Offers six preset values (0.0/12.0/13.5/13.75/15.0/16.5mm) to accommodate different frame types.
  • Data Visualization Output: Capable of printing six types of refractive graphs (including emmetropia, myopia, hyperopia, mixed astigmatism, etc.).
  • Multilingual Support: Built-in with six operational interface languages, including English, French, and German.
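
The vertex distance compensation above matters because the effective power of a correction changes with its distance from the eye. The standard vertex conversion formula from general optics (not the AR360's internal code) illustrates the effect:

```python
def effective_power(power_d: float, shift_toward_eye_m: float) -> float:
    """Standard vertex conversion: power (diopters) of an equivalent lens
    moved shift_toward_eye_m metres closer to the eye (negative = away)."""
    return power_d / (1 - shift_toward_eye_m * power_d)

# A -10.00 D spectacle lens at a 12 mm vertex distance is equivalent to
# about -8.93 D at the corneal plane:
print(round(effective_power(-10.0, 0.012), 2))  # -8.93
```

For low powers the correction is negligible, which is why vertex distance becomes clinically important mainly for prescriptions beyond roughly ±4 D.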

III. Comprehensive Device Operation Guide

3.1 Initial Setup and Calibration

Unboxing Procedure:

  • Remove the accessory tray (containing power cord, dust cover, printing paper, etc.)
  • Release the transport lock (using the provided screwdriver, turn counterclockwise 6 times)
  • Connect to power (note voltage specifications: 110V/230V)
  • Perform power-on self-test (approximately 30 seconds)

Basic Parameter Configuration:
Through the MODE→SETUP menu, configure:

  • Refractive power step size (0.01/0.12/0.25D)
  • Cylinder display format (negative/positive/mixed cylinder)
  • Automatic measurement switch (recommended to enable)
  • Sleep time (auto-hibernation after 5-90 minutes of inactivity)

3.2 Standard Measurement Procedure

Step-by-Step Instructions:

Patient Preparation:

  • Adjust seat height to ensure the patient is at eye level with the device.
  • Instruct the patient to remove glasses/contact lenses.
  • Explain the fixation target observation instructions.

Right Eye Measurement:

  • Slide the headrest to the right position.
  • Guide the patient to press their forehead firmly against the forehead support.
  • The system automatically completes alignment and measurement (approximately 3-5 seconds).
  • A “beep” sound indicates measurement completion.

Left Eye Measurement:

  • Slide the headrest to the left position and repeat the procedure.
  • Data is automatically associated and stored with the right eye measurement.

Data Management:

  • Use the REVIEW menu to view detailed data.
  • Press the PRINT key to output a report (supports mixed graphics-and-text printing).
  • Press CLEAR DATA to erase current measurement values.

3.3 Handling Special Scenarios

Common Problem Solutions:

Low Confidence Readings: May result from patient blinking or movement. Suggestions:

  • Have the patient blink fully to moisten the cornea.
  • Use tape to temporarily lift a drooping eyelid.
  • Adjust head position to keep eyelashes out of the optical path.

Persistent Alignment Failures:

  • Check the cleanliness of the observation window.
  • Verify ambient lighting (avoid direct strong light).
  • Restart the device to reset the system.

IV. Clinical Data Interpretation and Quality Control

4.1 Measurement Data Analysis

A typical printed report includes:

[Ref] Vertex = 13.75 mm
      Sph     Cyl    Ax
      -2.25   -1.50  10
      -2.25   -1.50  10
      -2.25   -1.50  10
Avg   -2.25   -1.50  10

Parameter Explanation:

  • Sph (Sphere): Negative values indicate myopia; positive values indicate hyperopia.
  • Cyl (Cylinder): Represents astigmatism power (axis determined by the Ax value).
  • Vertex Distance: A critical parameter affecting the effective power of the lens.
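
As a sketch of how the printed values are read, the snippet below averages repeated sphere readings and applies the sign convention above. The ±0.25 D cutoff for "emmetropia" is an illustrative assumption for the example, not a device parameter.

```python
# Illustrative interpretation of the printed report; the +/-0.25 D
# emmetropia cutoff is an assumption, not an AR360 setting.

def classify_sphere(sph: float) -> str:
    if sph <= -0.25:
        return "myopia"
    if sph >= 0.25:
        return "hyperopia"
    return "emmetropia"

readings = [(-2.25, -1.50, 10)] * 3   # (Sph, Cyl, Ax) triplets from the report
avg_sph = sum(s for s, _, _ in readings) / len(readings)
print(avg_sph, classify_sphere(avg_sph))  # -2.25 myopia
```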

4.2 Device Accuracy Verification

The AR360 ensures data reliability through a “triple verification mechanism”:

  • Hardware-Level: Automatic optical calibration after each measurement.
  • Algorithm-Level: Exclusion of outliers (automatically flags values with a standard deviation >0.5D).
  • Operational-Level: Support for multiple measurement averaging modes.
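
The >0.5 D standard-deviation flagging criterion described above can be illustrated with a simple check (a sketch of the idea, not the device's internal algorithm):

```python
import statistics

def flag_unreliable(sphere_readings, limit_d=0.5):
    """True when the sample standard deviation of repeated sphere
    readings exceeds limit_d, mirroring the >0.5 D criterion above."""
    return statistics.stdev(sphere_readings) > limit_d

print(flag_unreliable([-2.25, -2.25, -2.50]))  # False (consistent series)
print(flag_unreliable([-2.25, -0.50, -3.75]))  # True  (patient likely moved or blinked)
```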

Clinical verification data indicates:

  • Sphere Repeatability: ±0.12D (95% confidence interval)
  • Cylinder Axis Repeatability: ±5 degrees
These repeatability figures satisfy the manufacturer's ISO 9001 quality-system certification requirements.

V. Maintenance and Troubleshooting

5.1 Routine Maintenance Protocol

Periodic Maintenance Tasks:

  • Daily: Disinfect the forehead support with 70% alcohol.
  • Weekly: Clean the observation window with dedicated lens paper.
  • Monthly: Lubricate mechanical tracks with silicone-based lubricant.
  • Quarterly: Optical path calibration (requires professional service).

Consumable Replacement:

  • Printing Paper (Model 12441): Standard roll prints approximately 300 times.
  • Fuse Specifications:
    • 110V model: T 0.63AL 250V
    • 230V model: T 0.315AL 250V

5.2 Fault Code Handling

Common Alerts and Solutions:

Code | Phenomenon          | Solution
E01  | Printer jam         | Reload paper according to door diagram
E05  | Voltage abnormality | Check power adapter connection
E12  | Calibration failure | Perform manual calibration procedure
E20  | Communication error | Restart device or replace RS-232 cable
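
For sites that log device alerts electronically, the fault codes map naturally onto a small lookup (a convenience sketch; codes and remedies are taken from the table above):

```python
# Convenience lookup built from the fault-code table, e.g. for a service log.
FAULT_CODES = {
    "E01": "Printer jam: reload paper according to the door diagram",
    "E05": "Voltage abnormality: check power adapter connection",
    "E12": "Calibration failure: perform manual calibration procedure",
    "E20": "Communication error: restart device or replace RS-232 cable",
}

def advise(code: str) -> str:
    return FAULT_CODES.get(code, "Unknown code: contact the authorized service center")

print(advise("E05"))
```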

For unresolved faults, contact the authorized service center. Avoid disassembling the device yourself to prevent voiding the warranty.

VI. Technological Expansion and Clinical Applications

6.1 Comparison with Similar Products

Compared to traditional refraction devices, the AR360 offers significant advantages:

  • Efficiency Improvement: Reduces single-eye measurement time from 30 seconds to 5 seconds.
  • Simplified Operation: Reduces manual adjustment steps by 75%.
  • Data Consistency: Eliminates manual interpretation discrepancies (coefficient of variation <2%).

6.2 Clinical Value Proposition

  • Mass Screening: Rapid detection in schools, communities, etc.
  • Preoperative Assessment: Provides baseline data for refractive surgeries.
  • Progress Tracking: Establishes long-term refractive development archives.
  • Lens Fitting Guidance: Precisely measures vertex distance for frame adaptation.

VII. Development Prospects and Technological Evolution

Although the AR360 already boasts advanced performance, future advancements can be anticipated:

  • Bluetooth/WiFi wireless data transmission
  • Integrated corneal topography measurement
  • AI-assisted refractive diagnosis algorithms
  • Cloud platform data management

As technology progresses, automated refraction devices will evolve toward being “more intelligent, more integrated, and more convenient,” with the AR360’s design philosophy continuing to influence the development of next-generation products.

This guide provides a comprehensive analysis of the technical principles, operational methods, and clinical value of the Reichert AR360 Auto Refractor. It aims to help users fully leverage the device’s capabilities and deliver more precise vision health services to patients. Regular participation in manufacturer-organized training sessions (at least once a year) is recommended to stay updated on the latest feature enhancements and best practice protocols.

MKS PDR-C-2C Power Digital Readout Comprehensive User Guide

Product Overview and Core Features

The MKS PDR-C-2C is a professional-grade power supply and digital readout system designed for industrial pressure monitoring and control applications. As a mature product from MKS Instruments, the PDR-C-2C features a standard half-rack mount design, integrating high-precision power supply and dual-channel pressure signal processing capabilities.

Core Features:

  • Dual-Channel Pressure Monitoring: Connects to two independent pressure sensors for wide-range pressure monitoring.
  • High-Precision Digital Display: 4½-digit LED panel meter provides readings accurate to 0.01%.
  • Programmable Setpoint Control: Equipped with two independent setpoint relays for customizable trigger thresholds.
  • Multi-Unit Display: Supports seven engineering units: mmHg, psi, kPa, mbar, inHg, inH₂O, cmH₂O.
  • Stable Power Output: Provides ±15VDC/600mA dual outputs to meet most pressure sensor requirements.
  • Auto Channel Switching: Intelligently monitors dual-channel pressure values and automatically switches to display the sensor data with the optimal range.

Compared to the single-channel version PDR-C-1C, the PDR-C-2C adds dual-sensor interfaces and intelligent channel management, making it ideal for applications requiring wide-range pressure monitoring. The system’s modular design ensures easy maintenance, with all key parameters adjustable directly from the front panel without the need for specialized tools.

Safety Operating Procedures

As an electronic measurement device, the MKS PDR-C-2C must be used in strict compliance with safety regulations to prevent personal injury and equipment damage.

Electrical Safety Warnings:

  • Grounding Requirements: The device must be properly grounded through the grounding conductor of the power cord. Loss of protective grounding connection may result in all accessible conductive parts (including seemingly insulated knobs and controls) becoming live, posing an electric shock risk.
  • Power Supply Considerations:
    • Use only a power cord that meets specifications (conductor cross-sectional area ≥ 0.75mm²).
    • Use only the specified fuse type (1ASB for 120VAC, ½ASB for 240VAC).
    • Power voltage range: 117/234V ± 15%, 50-60Hz.
  • High Voltage Warning: High voltages are present in cables and sensors when the controller is powered on. Non-professionals are prohibited from opening the device casing.

Operating Environment Requirements:

  • Temperature Range: 0°C to 50°C.
  • Ventilation Requirements: Ensure adequate airflow around the device.
  • Prohibited Environments: Do not use in explosive environments unless the device is certified for such use.

Maintenance Safety:

  • No Unauthorized Modifications: Do not install replacement parts or make any unauthorized modifications to the instrument.
  • Professional Repairs: Component replacement and internal adjustments must be performed by qualified service personnel.
  • Cleaning and Maintenance: Regularly inspect cables for wear and check the casing for visible damage.

Device Installation and Connection

Unpacking Inspection:

Upon receiving the PDR-C-2C device, perform the following checks:

  • Inspect the packaging for any obvious signs of damage.
  • Verify the packing list:
    • Standard configuration: PDR-C-2C host, user manual.
    • Optional accessories: Electrical connector accessory kit (PDR-C-2C-K1), interface cables.

If any damage is found, immediately notify the carrier and MKS. If the device needs to be returned to MKS, first contact the MKS Service Center to obtain an Equipment Return Authorization (ERA) Number.

Mechanical Installation:

The device features a standard 19-inch half-rack design. When installing, note the following:

  • Ensure the installation location has sufficient space for heat dissipation (at least 5cm clearance on both sides recommended).
  • Use appropriate rack mounting hardware to secure the device.
  • Avoid installing in environments with strong vibrations or excessive dust.

Electrical Connections:

Power Connection Steps:

  1. Confirm that the voltage selection card at the rear of the device is set to match the local grid voltage.
  2. Insert a compliant power cord (conductor cross-sectional area ≥ 0.75mm²).
  3. Connect to a properly grounded power outlet.

Pressure Sensor Connection:

The PDR-C-2C provides two 6-position terminal block sensor interfaces. Wiring definitions are as follows:

Terminal Position | Signal Definition      | Standard Wire Color
1                 | Digital Ground (D GND) | Black
2                 | Analog Ground (A GND)  | Black
3                 | +15V Power Output      | Green
4                 | -15V Power Output      | White
5                 | Pressure Signal Input  | Red
6                 | Chassis Ground         | Thick Black

Grounding System:

The PDR-C-2C employs a three-ground system:

  • Digital Ground (D GND): Power return path.
  • Analog Ground (A GND): DC output signal return path.
  • Chassis Ground: Device casing ground.

When connecting pressure sensors with only a two-ground system, connect D GND and A GND to the sensor’s common ground, then connect the PDR’s chassis ground to the sensor’s chassis ground.

Front Panel Function Details

The PDR-C-2C front panel is designed for user-friendliness, with all commonly used functions directly operable without navigating complex menus.

Display Area:

  • 4½-Digit LED Display: Red flat LED numeric display, range -19999 to 19999.
  • x10⁻³ Indicator: Illuminates to indicate that the current display value should be multiplied by 0.001.
  • Channel Indicator: Displays the currently active pressure channel (1 or 2).

Function Switches:

  • Power Switch: Controls the main power supply to the device.
  • Engineering Unit Selection Switch: Seven-position rotary switch for selecting units: mmHg, psi, kPa, mbar, inHg, inH₂O, cmH₂O.
  • Channel Selection/Remote/Auto Switch (PDR-C-2C Specific):
    • Position “1”: Fixed display of channel 1.
    • Position “2”: Fixed display of channel 2.
    • “AUTO”: Automatic channel switching mode.
    • “REMOTE”: Allows remote channel selection via the rear interface.

Adjustment Controls:

  • Zero Adjustment (Zero):
    • Used for fine zero-point correction of pressure signals.
    • Adjustment range: ±1.5% full scale.
    • Absolute pressure gauges must be evacuated below their resolution before adjustment.
    • Differential pressure gauges should undergo cross-porting.
  • Setpoint Adjustment (Set Point):
    • Independent Coarse and Fine adjustment knobs for each channel.
    • Adjustment range: 0-100% full-scale pressure.
    • Use the “Read Set Point” switch to view setpoint values in real-time.
  • Setpoint Read Switch (Read Set Point):
    • Middle position: Displays current pressure value.
    • Left position: Displays channel 1 setpoint value.
    • Right position: Displays channel 2 setpoint value.
    • Automatically returns to the middle position after release.

Status Indicators:

  • Setpoint Relay Indicators: LEDs illuminate to indicate that the corresponding relay is energized (pressure below setpoint).
  • Overload Indicator: Blank display indicates that the input signal exceeds approximately 11V.

Rear Panel Interface Details

The PDR-C-2C rear panel contains multiple professional interfaces that extend system functionality.

Pressure Sensor Interfaces:

Two 6-position terminal blocks for connecting pressure sensors. Provides sensor operating power (±15V) and signal input. Each interface includes an independent decimal point selection switch.

Decimal Point Selection Switch:

4PST rocker switch for setting the display decimal point based on sensor range:

Range | Switch Position
1     | Switch 1 ON
10    | Switch 2 ON
100   | Switch 3 ON
1000  | Switch 4 ON
10000 | All OFF

Note: Only one switch per channel should be in the ON position. Simultaneously closing multiple switches may result in abnormal display.

Power Interface Module:

  • Accepts standard power cords.
  • Built-in line filter.
  • Voltage selection card visible behind a plastic window.

Steps for Voltage Replacement:

  1. Unplug the power cord and slide the plastic window to the left.
  2. Pull out the fuse holder to eject the fuse.
  3. Use a probe to remove the voltage selection card.
  4. Reinsert the card with the desired voltage facing outward.
  5. Install the appropriate fuse.
  6. Slide the window to the right.
  7. Insert the power cord.

Interface Connector (J118):

20-pin interface providing external control signal access:

Pin  | Function Description
1    | Signal Ground
2    | Digital Ground
4    | Switched DC Output (Engineering Unit)
6    | Setpoint 2 Relay Latch
7    | Setpoint 1 Relay Latch
8-10 | Setpoint 1 Relay Contacts (NO/NC/COM)
A    | Channel 2 Range ID
B    | Channel 1 Range ID
C    | Remote Channel Selection
F    | Channel 2 DC Output (0-10V)
H    | Channel 1 DC Output (0-10V)
J-L  | Setpoint 2 Relay Contacts

BCD Output Connector (Optional):

Provides 5V BCD logic output for direct connection to digital devices for remote readout:

  • Data update cycle approximately 0.5 seconds.
  • Includes polarity, overrange, and other status signals.
  • Enables multi-device bus sharing via control lines.

Operating Theory and Work Modes

Pressure Signal Processing Flow:

  1. Sensor signals are input through the rear panel terminal blocks.
  2. Signals pass through an input amplifier (U1) where fine zero-point correction is applied.
  3. Signals are split into three paths:
    • Output buffer amplifiers (U2, U3) → Rear interface.
    • Setpoint comparison circuit.
    • Engineering unit scaling circuit → Display DVM.

Setpoint System:

  • Two independent setpoint relays.
  • Select which pressure signal to monitor via the rear panel switch (PDR-C-2C).
  • Compare input signals with adjustable reference voltages (front panel controls).
  • “Fail-Safe” logic: No power state = High-pressure state.
  • Relay states can be remotely locked via the LATCH lines of the J118 interface.
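
A behavioural sketch of one setpoint channel follows, assuming the 0.5% full-scale hysteresis from the specifications and the energized-below-setpoint convention from the front-panel section; the exact placement of the hysteresis band is an assumption.

```python
class SetpointRelay:
    """Behavioural model of one setpoint channel. Energized while the
    pressure is below the setpoint, with 0.5% full-scale hysteresis per
    the specifications (band placement below the setpoint is an
    assumption). De-energized is the power-off default, matching the
    fail-safe "no power = high pressure" convention."""

    def __init__(self, setpoint: float, full_scale: float):
        self.setpoint = setpoint
        self.hysteresis = 0.005 * full_scale
        self.energized = False  # power-off default reads as high pressure

    def update(self, pressure: float) -> bool:
        if pressure < self.setpoint - self.hysteresis:
            self.energized = True    # well below setpoint: relay pulls in
        elif pressure > self.setpoint:
            self.energized = False   # above setpoint: relay drops out
        return self.energized        # inside the band: state unchanged

relay = SetpointRelay(setpoint=500.0, full_scale=1000.0)
print(relay.update(400.0), relay.update(502.0), relay.update(497.0))  # True False False
```

Note how the reading of 497 (inside the hysteresis band) leaves the relay state unchanged, which is what prevents contact chatter near the threshold.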

Auto Channel Switching Logic (PDR-C-2C Specific):

  • Comparator monitors channel 1 signal:
    • 90% full-scale trigger.
    • 100% full-scale trigger.
  • Channel 1 < 90%: Display channel 1.
  • Channel 1 > 100%: Automatically switch to channel 2.
  • Channel 1 drops from > 100% to < 90%: Switch back to channel 1.
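
The switching rules above can be sketched directly; the 90%/100% thresholds form a hysteresis band that prevents rapid toggling near full scale:

```python
def select_channel(ch1_fraction: float, current: int) -> int:
    """Auto-switch logic from the list above: go to channel 2 when
    channel 1 exceeds 100% of full scale; return to channel 1 only
    after it falls back below 90% (hysteresis prevents chatter)."""
    if current == 1 and ch1_fraction > 1.00:
        return 2
    if current == 2 and ch1_fraction < 0.90:
        return 1
    return current

ch = 1
for frac in (0.50, 1.05, 0.95, 0.85):   # channel-1 signal as fraction of full scale
    ch = select_channel(frac, ch)
    print(frac, "-> channel", ch)
```

The 0.95 reading keeps channel 2 displayed even though channel 1 is back in range; only the drop below 0.90 switches the display back.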

Power System:

  • Provides ±15V for internal circuits and sensor power.
  • Overload and overheating protection.
  • Supplies precision reference voltage for comparators.
  • Display DVM has its own +5V power supply.

Advanced Function Configuration

Engineering Unit Calibration:

  1. Short-circuit pressure input to signal ground.
  2. Connect DVM to analog ground and CH1 signal test point.
  3. Power on and adjust ZERO trimmer to display 0.000V ± 0.0005V on DVM.
  4. Apply a 10.0000V ± 0.0005V standard signal.
  5. Adjust the corresponding trimmer resistor based on the selected unit:
Unit  | Trimmer Resistor | Theoretical Display Value
mbar  | R47              | 13332
kPa   | R47              | 13332
mmHg  | R49              | 10000
psi   | R44              | 19337
cmH₂O | R55              | 13597
inH₂O | R51              | 5353
inHg  | R53              | 3937
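
The theoretical display values follow from standard pressure-unit conversions applied to a 10 V input representing 1000 mmHg (consistent with the mmHg row); the sketch below reproduces the table's digits, with any last-digit differences reflecting the particular conversion constants trimmed into the hardware.

```python
# Sketch of the scaling behind the calibration table, assuming a 10 V
# input that represents 1000 mmHg. Conversion factors are standard
# reference values; the meter's decimal point is positioned separately.
MMHG_TO_UNIT = {
    "mmHg": 1.0,
    "mbar": 1.33322,
    "kPa": 0.133322,
    "psi": 0.0193368,
    "cmH2O": 1.35951,
    "inH2O": 0.535240,
    "inHg": 0.0393701,
}

def convert(pressure_mmhg: float, unit: str) -> float:
    return pressure_mmhg * MMHG_TO_UNIT[unit]

print(round(convert(1000.0, "mbar"), 1))  # 1333.2 -> displayed digits 13332
print(round(convert(1000.0, "psi"), 3))   # 19.337 -> displayed digits 19337
```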

Remote Control Interface Applications:

Through the J118 interface, the following functions can be achieved:

  • Remote Channel Selection: Input high level (or leave floating) on pin C to select channel 1, low level to select channel 2.
  • Relay State Locking: Pull pins 6 or 7 low to lock the corresponding relay state.
  • Analog Signal Monitoring:
    • Pin H: Channel 1 0-10V output (with zero-point correction).
    • Pin F: Channel 2 0-10V output.
    • Pin 4: Output after engineering unit switching.

BCD Output Configuration (Optional Function):

  • Data Ready Signal (DATA READY): High level indicates BCD data is valid.
  • Bit Enable Control: Ground the DIGIT ENABLE line to read the corresponding BCD data bit.
  • Polarity Output: Indicates the sign of the reading.
  • Overrange Signal: Indicates that the input exceeds the range.

System Maintenance and Troubleshooting

Daily Maintenance:

  • Regularly inspect cables for wear.
  • Check the casing for visible damage.
  • Clean ventilation holes to ensure good heat dissipation.
  • Verify that all connectors are secure.

Fault Isolation Process:

Power Check:

  • Measure ±15V outputs (relative to P GND).
  • Normal range: 14.8-15.2V.
  • Ripple < 10mVp-p.
  • If abnormal, disconnect sensors and retest.

Signal Path Check:

  • Use 10kΩ and 5.1kΩ resistors to simulate sensor input (should produce 9.6-10.3V).
  • Measure voltages at various test points for normalcy.
  • Check key operational amplifiers such as U1, U2, U3.
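
The 9.6-10.3 V window quoted above follows from simple divider arithmetic, assuming the 5.1 kΩ resistor sits on the +15 V side and the 10 kΩ resistor goes to ground (the placement is an assumption; the service manual's figure is authoritative):

```python
# Divider arithmetic behind the 9.6-10.3 V simulated-sensor check.
# Resistor placement (5.1 kOhm to +15 V, 10 kOhm to ground) is assumed.
r_top, r_bottom = 5.1e3, 10.0e3
v_supply = 15.0
v_sim = v_supply * r_bottom / (r_top + r_bottom)
print(round(v_sim, 2))  # 9.93 -> inside the expected 9.6-10.3 V window
```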

Setpoint Circuit Check:

  • Confirm comparator input voltages (should follow setpoint adjustments).
  • Check relay driver circuits (Q5, Q6).
  • Test relay contact states.

Channel Selection Circuit Check (PDR-C-2C):

  • Verify U4, U5 comparator switching points (9V and 10V).
  • Check relay K1 switching state.
  • Test remote selection logic (U6).

Common Issue Handling:

Issue 1: Inaccurate Display

  • Check sensor connections.
  • Verify decimal point switch settings.
  • Recalibrate engineering units.

Issue 2: Relays Do Not Actuate

  • Check setpoint adjustments.
  • Measure comparator outputs.
  • Verify relay driver voltages.

Issue 3: Auto Switching Fails

  • Check if channel 1 signal reaches switching thresholds.
  • Verify U4, U5 comparator operation.
  • Test channel selection relay.

Technical Specifications and Model Descriptions

Physical Specifications:

  • Dimensions: Standard half-rack width.
  • Display: 4½-digit red LED.
  • Weight: 3.2kg.
  • Connectors: 20-pin interface, 6-position terminal block.

Electrical Specifications:

  • Power Consumption: 65W (full load).
  • Operating Voltage: 117/234V ± 15%, 50-60Hz.
  • Power Output: ±15V @ 600mA.
  • Analog Output: 0-10V (10kΩ load).
  • Meter Accuracy: 0.01% reading ± 1 digit.
  • Input Impedance: 900kΩ.

Setpoint Specifications:

  • Relay Configuration: Single-pole double-throw (SPDT).
  • Contact Rating: 2A @ 28VDC or 1A @ 120VAC.
  • Hysteresis: 0.5% full scale.
  • Adjustment Range: 100% full scale.

Model Coding:

PDRCXXYYY format:

  • XX: Channel count (2C indicates dual-channel).
  • YYY: Options (BCD indicates BCD output, E indicates CE certification).

Application Cases and Best Practices

Wide-Range Pressure Monitoring System:

Configuration Recommendations:

  • Connect a high-precision low-pressure sensor (e.g., 10 Torr) to channel 1.
  • Connect a large-range sensor (e.g., 1000 Torr) to channel 2.
  • Set to “AUTO” mode for seamless range switching.
  • Use setpoint 1 for low-pressure alarms and setpoint 2 for high-pressure alarms.

Industrial Process Control Integration:

Integration Scheme:

  • Connect to PLC via the J118 interface.
  • Use 0-10V outputs for pressure monitoring.
  • Obtain relay states via digital lines.
  • Remotely switch display channels.
  • Connect BCD interface to digital recorders.
  • Use setpoints to control safety valves or alarms.

Maintenance Tips:

  • Regular Calibration:
    • Zero-point calibration at least annually.
    • Full-scale calibration every two years.
  • Sensor Connection:
    • Use shielded cables to reduce interference.
    • Avoid running parallel to power lines.
  • Environmental Control:
    • Keep the working environment clean.
    • Control ambient temperature within recommended ranges.

Appendix: Compatible Sensors and Accessories

Compatible Pressure Sensors:

MKS Baratron Series Compatible Sensors:

Model   | Remarks
121     |
221-224 |
622-628 | 628 supports single-channel use only
722     |

Recommended Accessories:

  • Electrical Connection Kit: PDR-C-2C-K1.
    • Includes all necessary connectors for installation.
    • Provides spare fuses.
  • Interface Cables:
    • 20-pin interface extension cable.
    • BCD output cable.
  • Calibration Tools:
    • Precision voltage source.
    • High-precision digital multimeter.

By systematically studying this guide, users should be able to fully master the various functions and operating methods of the MKS PDR-C-2C Power Digital Readout, leveraging its high-performance advantages in practical applications to provide reliable solutions for industrial pressure monitoring and control.

Comprehensive User Guide for Hach HQ30D Series Dissolved Oxygen Meters

Chapter 1: Product Overview and Technical Specifications

1.1 Introduction to HQ30D Series Products

The Hach HQ30D series dissolved oxygen meters are high-performance portable instruments developed by Hach Company. Paired with the LDO101 probe, they use luminescent dissolved oxygen (LDO) sensor technology and are widely applied in environmental monitoring, wastewater treatment, aquaculture, and scientific research. Known for high precision, stability, and portability, the HQ30D series meets dissolved oxygen measurement needs in a variety of demanding environments. The series includes multiple models, allowing users to select the most suitable one for their requirements; all models employ the same core measurement technology, ensuring consistent and reliable results.

1.2 Key Technical Specifications

Measurement Performance Indicators:

  • Measurement Range: 0–20 mg/L (ppm) or 0–200% saturation
  • Resolution: 0.01 mg/L or 0.1% saturation
  • Accuracy: ±0.1 mg/L or ±1.5% of the reading (whichever is greater)
  • Response Time: <30 seconds to reach 90% of the final value (at 25°C water sample)

Environmental Adaptability:

  • Operating Temperature Range: 0–50°C
  • Storage Temperature Range: -20–60°C
  • Protection Class: IP67 (fully dustproof and waterproof for short-term immersion)
  • Power Supply: 6-12V DC adapter or 4 AA alkaline batteries
  • Battery Life: Approximately 40 hours of continuous use (with new batteries)

Physical Characteristics:

  • Host Dimensions: 215 × 87 × 42 mm
  • Weight: Approximately 520 g (including batteries)
  • Display: 4-digit LCD with backlight

Chapter 2: Instrument Components and Installation

2.1 Standard Accessories List

Standard Configuration:

  • HQ30D host unit (1)
  • LDO101 dissolved oxygen electrode (1)
  • Power adapter (input: 100-240V AC, output: 6-12V DC)
  • 4 AA alkaline batteries (pre-installed)
  • Portable carrying case (1)
  • User manual and certificate of conformity (1 each)

Optional Accessories:

  • Spare electrode membrane kit (including electrolyte)
  • BOD measurement kit
  • Dissolved oxygen standard calibration solution set
  • Data cable and printing accessories

2.2 Instrument Assembly Steps

Battery Installation Procedure:

  1. Place the instrument upside down on a stable surface.
  2. Locate the battery compartment cover at the bottom and slide to unlock.
  3. Insert 4 AA batteries according to the polarity markings inside the compartment.
  4. Ensure proper battery contact and close the compartment cover.

Electrode Connection Method:

  1. Remove the electrode protective cap.
  2. Insert the electrode into the dedicated interface on the top of the host unit.
  3. Rotate the locking ring clockwise until securely fastened.
  4. Check the connection for stability and ensure no loosening.

Initial Use Preparation:

  • Activate the new electrode by soaking it in clean water for 2-4 hours.
  • Perform a complete calibration procedure before the first use.
  • Check the connections of all components for firmness.

Chapter 3: Basic Operation and Calibration

3.1 Power-On and Interface Navigation

Power-On Procedure:

  1. Press and hold the power button for 2 seconds to start the instrument.
  2. After system self-check, the main interface will be displayed.
  3. The default display shows the dissolved oxygen concentration (mg/L).

Interface Functional Areas:

  • Main Display Area: Real-time measurement value
  • Status Indicator Area: Battery level, calibration status, and other icons
  • Unit Display: Current measurement unit (mg/L or %)

Basic Button Functions:

  • Power Button: Power on/off and backlight activation
  • Mode Button: Switch between display modes
  • Calibration Button: Enter calibration program
  • Setting Button: Parameter configuration menu
  • Up/Down Buttons: Numerical adjustment and menu navigation

3.2 Zero Calibration Procedure

Preparation:

  • Prepare a zero-oxygen solution (0.25 g anhydrous sodium sulfite dissolved in 250 mL distilled water).
  • Ensure the electrode is clean and free from contamination.
  • Power on the instrument and allow it to warm up for 5 minutes.

Calibration Steps:

  1. Immerse the electrode in the zero-oxygen solution.
  2. Press the calibration button to enter the calibration menu.
  3. Select “Zero Calibration.”
  4. Wait for the reading to stabilize (approximately 3-5 minutes).
  5. Confirm that the calibration value displays 0.00 mg/L.
  6. Press the confirm button to complete the zero calibration.

3.3 Full-Scale Calibration Procedure

Preparation:

  • Prepare a saturated dissolved oxygen water sample (vigorously shake for 5 minutes) or use a dedicated saturated oxygen standard solution.
  • Ensure the water sample temperature is stable at 20-25°C.

Calibration Steps:

  1. Immerse the electrode in the saturated oxygen water sample.
  2. Press the calibration button to enter the calibration menu.
  3. Select “100% Calibration.”
  4. Gently stir the electrode to ensure water sample flow.
  5. Wait for the reading to stabilize (display shows “Stabilizing…”).
  6. Confirm that the reading is close to the theoretical saturation value.
  7. Press the confirm button to complete the full-scale calibration.

Chapter 4: Measurement Operation and Data Processing

4.1 Standard Measurement Procedure

Standard Measurement Steps:

  1. Immerse the electrode in the water sample to be tested.
  2. Ensure the electrode is in full contact with the water sample.
  3. Gently stir the electrode (approximately 2-3 times per second).
  4. Wait for the reading to stabilize (approximately 30-60 seconds).
  5. Record the measurement result.

Precautions:

  • Avoid vigorous stirring to prevent bubble formation.
  • Keep the electrode membrane surface clean.
  • Recommend measuring at a depth of 5-10 cm below the water surface.
  • Avoid direct sunlight exposure to the measurement area.

4.2 Data Recording and Storage

Manual Data Recording:

  1. After the measurement value stabilizes, press the storage button.
  2. Enter the sample number (optional).
  3. The measurement time and value will be automatically recorded.
  4. Add remarks (such as sampling location) if necessary.

Automatic Storage Function:

  • Set up timed automatic storage.
  • Storage interval adjustable from 1-60 minutes.
  • Maximum storage capacity of 500 data sets.

Data Query Method:

  1. Press the menu button to enter data management.
  2. Select “Data Review.”
  3. Search for records by date or number.
  4. View detailed measurement information.

4.3 Data Export and Printing

Computer Connection:

  1. Connect the instrument to a PC using a dedicated data cable.
  2. Install the Hach data management software.
  3. Set communication parameters (9600 baud rate).
  4. Export data in Excel or text format.

Printing Output:

  1. Connect a compatible micro-printer.
  2. Select the data to be printed.
  3. Print single measurements or batch data.
  4. Printed content includes measurement values, time, and other information.

Chapter 5: Advanced Function Applications

5.1 BOD Measurement Mode

BOD5 Measurement Preparation:

  • Prepare a 300 mL BOD incubation bottle.
  • Collect representative water samples.
  • Dilute as necessary.

Measurement Steps:

  1. Measure the initial DO value (D1) of the sample.
  2. Seal the incubation bottle and place it in a 20 ± 1°C environment.
  3. After 5 days, measure the final DO value (D2).
  4. Calculate BOD5 = D1 – D2 (considering dilution factor).
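The BOD5 arithmetic in step 4, including the dilution correction, can be sketched as follows. The function name and the seed-free simplification are assumptions for illustration; standard practice expresses dilution as the decimal fraction of sample in the incubation bottle, so BOD5 = (D1 - D2) / P.

```python
def bod5(d1: float, d2: float, sample_fraction: float = 1.0) -> float:
    """BOD5 in mg/L from initial DO (d1) and day-5 DO (d2).

    sample_fraction is the decimal fraction of sample in the 300 mL
    incubation bottle (1.0 = undiluted); the measured depletion is
    scaled up by the dilution: BOD5 = (D1 - D2) / P. Seed corrections
    are omitted in this simplified sketch.
    """
    if not 0.0 < sample_fraction <= 1.0:
        raise ValueError("sample_fraction must be in (0, 1]")
    return (d1 - d2) / sample_fraction

# A sample diluted 1:10 (30 mL sample in a 300 mL bottle) that drops
# from 8.6 to 4.3 mg/L gives BOD5 = (8.6 - 4.3) / 0.1, i.e. about 43 mg/L:
print(round(bod5(8.6, 4.3, sample_fraction=0.1), 1))
```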

Precautions:

  • Use a dedicated BOD bottle cap to ensure sealing.
  • Avoid light exposure during incubation.
  • Verify high BOD samples through multiple dilutions.

5.2 Salinity and Barometric Pressure Compensation

Salinity Compensation Setting:

  1. Press the setting button to enter the parameter menu.
  2. Select “Salinity Compensation.”
  3. Enter the actual salinity value of the water sample (0-40 ppt).
  4. Confirm to automatically apply the compensation algorithm.

Barometric Pressure Compensation Setting:

  1. Enter the setting menu and select “Barometric.”
  2. Manually enter the local barometric pressure value or select “Auto” to use the built-in sensor.
  3. Confirm to automatically adjust saturation calculations.
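To first order, the saturation value the meter targets scales with the ratio of local barometric pressure to standard sea-level pressure. The sketch below shows that proportionality (Henry's law); it is a simplified model, not the instrument's internal compensation algorithm.

```python
STANDARD_PRESSURE_MMHG = 760.0

def pressure_corrected_saturation(sat_at_760: float, local_mmhg: float) -> float:
    """First-order barometric correction of a DO saturation value (mg/L).

    Oxygen solubility is proportional to the partial pressure of oxygen
    (Henry's law), so the tabulated 760 mmHg saturation value is scaled
    by the local pressure ratio. Real instruments also correct for water
    vapour pressure; that refinement is omitted here.
    """
    return sat_at_760 * local_mmhg / STANDARD_PRESSURE_MMHG

# At ~1500 m elevation (about 635 mmHg), a 20 degC saturation value of
# 9.08 mg/L drops to roughly 7.6 mg/L:
print(round(pressure_corrected_saturation(9.08, 635.0), 2))
```

This is why entering the correct local pressure (or using the built-in sensor) matters most at high altitude.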

Temperature Compensation:

  • Automatically compensates based on the built-in temperature sensor.
  • Ensure the temperature probe is clean and free from contamination.
  • Check the temperature sensor if abnormal temperature readings are displayed.

Chapter 6: Maintenance and Troubleshooting

6.1 Daily Maintenance Points

Electrode Maintenance:

  • Replace the electrolyte and membrane kit monthly.
  • Clean the electrode surface after use.
  • Keep the electrode moist during short-term storage.
  • Store dry during long-term storage.

Instrument Cleaning:

  • Regularly wipe the exterior with a damp cloth.
  • Avoid using organic solvents.
  • Keep the interface dry and clean.
  • Check the battery compartment for corrosion.

Calibration Recommendations:

  • Check the zero point before daily use.
  • Perform full-scale calibration weekly.
  • Recalibrate after replacing the electrolyte.
  • Calibrate before use after long-term storage.

6.2 Common Fault Handling

Display Issues:

  • No display: Check battery/power connections.
  • Blurry display: Replace batteries or adjust contrast.
  • Backlight not illuminated: Check settings or battery level.

Measurement Abnormalities:

  • Unstable readings: Clean the electrode and check connections.
  • Slow response: Replace the electrolyte and membrane.
  • Calibration failure: Check calibration solution and confirm electrode status.

Error Codes:

  • Err 1: Sensor failure, check the electrode.
  • Err 2: Out of range, dilute the sample.
  • Err 3: Calibration error, recalibrate.
  • Err 4: Temperature sensor abnormality.

Chapter 7: Safety Regulations and Technical Support

7.1 Safety Operation Regulations

Electrical Safety:

  • Use only the original power adapter.
  • Do not use Ni-Cd rechargeable batteries.
  • Avoid charging in humid environments.

Chemical Safety:

  • Wear protective equipment when handling chemical reagents.
  • Rinse immediately if electrolyte contacts the skin.
  • Dispose of waste chemicals according to regulations.

Operational Safety:

  • Do not immerse the instrument in deep water.
  • Avoid strong vibrations or drops.
  • Avoid prolonged use in high-temperature environments.

7.2 Service and Support

Warranty Policy:

  • Host unit warranty period: 12 months.
  • Electrode warranty period: 6 months.
  • Damage caused by human factors is not covered by the warranty.

Repair Services:

  • Authorized repair centers nationwide provide services.
  • Provide the product serial number for repairs.
  • Non-professionals should not disassemble the instrument.

Chapter 8: Practical Application Tips

8.1 Methods to Improve Measurement Accuracy

Sample Handling Techniques:

  • Allow the sample to stand for 2-3 minutes before measurement.
  • Maintain stable sample temperature.
  • Avoid gas exchange during sample transfer.

Electrode Usage Techniques:

  • Regularly polish the electrode surface.
  • Keep the membrane moist during storage.
  • Avoid scratching the membrane surface.

Environmental Control Points:

  • Avoid strong electromagnetic interference sources.
  • Maintain stable temperature in the measurement environment.
  • Accurately set compensation for high-salinity samples.

8.2 Handling Special Application Scenarios

Low Dissolved Oxygen Measurement:

  • Use fresh zero-oxygen solution for calibration.
  • Extend the stabilization time.
  • Use a flow measurement cell to reduce interference.

High-Salinity Water Samples:

  • Accurately measure and input the salinity value.
  • Consider using a high-salinity dedicated electrode.
  • Increase calibration frequency.

Flowing Water Body Measurement:

  • Use a flow adapter to fix the electrode.
  • Select representative measurement positions.
  • Avoid turbulence and bubble interference.

Conclusion

The Hach HQ30D series dissolved oxygen meters are comprehensive, user-friendly professional water quality analysis instruments. With the systematic introduction in this guide, users should be able to proficiently master the instrument's functions and maintenance points. Correct usage and regular maintenance not only ensure the accuracy of measurement data but also extend the instrument's service life.

As a key indicator in water quality monitoring, accurate dissolved oxygen measurement is crucial for water environment management. We hope this guide helps users fully leverage the performance advantages of the HQ30D dissolved oxygen meters, providing reliable technical support for water quality monitoring work. For further technical assistance, please contact Hach Company's professional service team at any time.

Posted on

Comprehensive User Guide for Hach Sension6 Portable Dissolved Oxygen Meter

Preface: Overview of Dissolved Oxygen Measurement Technology and Instruments

Dissolved oxygen (DO) is a crucial parameter in water quality monitoring, reflecting the self-purification capacity of water bodies and the health of aquatic ecosystems. The Hach Sension6 portable dissolved oxygen meter employs polarographic sensor technology, offering a measurement range of 0-20 mg/L (ppm) and 0-200% saturation, with a resolution of 0.01 mg/L and 0.1% saturation. It supports dual power supply options (a 6-12V adapter or four AA alkaline batteries), carries an IP67 protection rating, and features built-in data storage. Data can be transferred to a computer or printer via an RS232 interface. This guide aims to help users comprehensively master the instrument's operation, maintenance, and troubleshooting methods.

Chapter 1: Instrument Structure and Function Details

1.1 Instrument Composition and Standard Accessories

Standard Configuration:

  • Main unit (including electrode holder)
  • Dissolved oxygen electrode
  • Power adapter (Product No.: 9185600)
  • 4 AA alkaline batteries
  • Data transfer cable (RS232 port, black)
  • Operation manual and certificate of conformity

Optional Accessories:

  • BOD measurement kit (Product No.: 51971-00)
  • 100 mg/L dissolved oxygen standard solution (100 mL, Product No.: 21503-42)
  • Citizen PN60 micro-printer (Product No.: 26687-00)
  • Spare dissolved oxygen electrode membrane (4/pkg, Product No.: 27584-00)

1.2 Instrument Technical Specifications

Measurement Performance:

  • Measurement range: 0~20 mg/L (ppm), 0~200% saturation
  • Resolution: 0.01 mg/L, 0.1% saturation
  • Accuracy: ±0.1 mg/L or ±1.5% of reading (whichever is greater)
  • Response time: <30 seconds to reach 90% of final value (at 25°C water sample)

Environmental Adaptability:

  • Operating temperature: 0~50°C
  • Storage temperature: -20~60°C
  • Protection rating: IP67 (dust-tight and waterproof)
  • Power supply: 6-12V DC adapter or 4 AA alkaline batteries
  • Battery life: Approximately 6 months (under normal use)

Physical Characteristics:

  • Dimensions: 21.2 × 8.7 × 4.2 cm
  • Weight: Approximately 500 g (including batteries)
  • Display: 4-digit LCD, 1.5 cm character height

1.3 Keyboard Function Details

Main Function Keys:

  • SETUP/CE: Enter setup menu or clear current input
  • READ/ENTER: Confirm selection or start measurement
  • EXIT: Exit current menu or cancel operation

Auxiliary Function Keys:

  • CONC%: Switch between concentration (mg/L) and saturation (%) display
  • STORE: Store current measurement data
  • RECALL: Retrieve historically stored data
  • TIME/DATE: View or set time and date
  • PRINT: Print data via RS232 interface

Navigation Keys:

  • ▲/▼: Move up or down in the menu to select items

Chapter 2: Initial Instrument Setup and Calibration

2.1 Power Management and Battery Installation

Battery Installation Steps:

  1. Place the instrument upside down on a soft pad.
  2. Open the battery compartment cover at the bottom.
  3. Insert 4 AA alkaline batteries according to the marked direction (do not use Ni-Cd rechargeable batteries).
  4. Close the battery compartment cover.

Notes:

  • The display will show “LOW BATTERY” when the battery level is low.
  • It is recommended to remove the batteries if the instrument is not in use for an extended period.
  • After replacing the batteries, the time and date need to be reset.

2.2 Basic Parameter Settings

Date Setting:

  1. Press the SETUP/CE key to enter the setup menu.
  2. Select the “Date” option.
  3. Enter the current date (format: MM/DD/YY).
  4. Press READ/ENTER to confirm.

Time Setting:

  1. In the setup menu, select “Time”.
  2. Enter the time in 24-hour format (e.g., 14:00).
  3. Press READ/ENTER to confirm.

Unit Setting:

  1. Enter the setup menu and select “Units”.
  2. Choose mg/L or % saturation as the default display unit.
  3. Press READ/ENTER to confirm.

2.3 Sensor Installation and Preparation

Dissolved Oxygen Electrode Installation:

  1. Insert the electrode into the electrode socket on the top of the instrument.
  2. Rotate the locking ring clockwise to secure the electrode.
  3. Ensure the electrode is firmly connected to the instrument.

Electrode Activation:

  • For initial use or after long-term storage, immerse the electrode in water for at least 2 hours.
  • Regularly check if the electrode membrane is intact, without damage or contamination.
  • Keep the surface of the electrode membrane clean and avoid scratching it.

Chapter 3: Dissolved Oxygen Measurement Operation Process

3.1 Zero Calibration (Zero Oxygen Calibration)

Preparation of Zero Oxygen Solution:

  • Take 250 mL of distilled water and add 0.25 g of anhydrous sodium sulfite.
  • Stir until completely dissolved (to create a zero-oxygen environment).

Calibration Steps:

  1. Immerse the electrode in the zero-oxygen solution.
  2. Press the SETUP/CE key to enter the setup menu.
  3. Select “Calibration” → “Zero Cal”.
  4. Wait for the reading to stabilize (about 3-5 minutes).
  5. Press READ/ENTER to confirm the zero point.
  6. Press EXIT to exit the calibration mode.

3.2 Full-Scale Calibration (100% Saturation Calibration)

Preparation of Saturated Oxygen Water:

  • Take 150 mL of distilled water and shake vigorously for 5 minutes.
  • Alternatively, use a specially prepared saturated dissolved oxygen standard solution.

Calibration Steps:

  1. Immerse the electrode in the saturated oxygen water.
  2. Press the SETUP/CE key to enter the setup menu.
  3. Select “Calibration” → “100% Cal”.
  4. Wait for the reading to stabilize (display “Stabilizing…”).
  5. Press READ/ENTER to confirm the full-scale value.
  6. Press EXIT to exit the calibration mode.

3.3 Sample Measurement

Standard Measurement Process:

  1. Immerse the electrode in the water sample to be tested.
  2. Gently stir the electrode to keep the water sample flowing (avoid generating bubbles).
  3. Wait for the reading to stabilize (about 30-60 seconds).
  4. Press the CONC% key to switch between mg/L and % saturation display.
  5. Record the measurement result.

Notes:

  • Avoid direct sunlight on the sample during measurement.
  • Keep the temperature of the water sample stable (temperature changes affect dissolved oxygen).
  • For high-salinity samples, set the salinity compensation.

3.4 Salinity and Barometric Pressure Compensation

Salinity Compensation Setting:

  1. Press SETUP/CE to enter the setup menu.
  2. Select the “Salinity” option.
  3. Enter the salinity value of the sample (0-42 ppt).
  4. Press READ/ENTER to confirm.

Barometric Pressure Compensation Setting:

  1. Enter the setup menu and select “Barometer”.
  2. Enter the local atmospheric pressure value (mmHg or inHg), or select “Auto” to obtain barometric pressure data automatically.
  3. Press READ/ENTER to confirm.

Chapter 4: Advanced Function Applications

4.1 BOD Measurement Mode

BOD Measurement Steps:

  1. Prepare a 300 mL BOD sample bottle.
  2. Initially measure the DO value of the sample and record it.
  3. Place the sample bottle in a 20°C incubator for 5 days.
  4. After 5 days, measure the DO value again.
  5. Calculate the BOD value (initial DO – final DO).

Notes:

  • Use a dedicated BOD bottle cap to prevent gas exchange.
  • Keep the incubation temperature constant at 20 ± 1°C.
  • For high-BOD samples, appropriate dilution may be required.

4.2 Data Storage and Retrieval

Data Storage:

  1. After the measurement result is displayed, press the STORE key.
  2. Enter the sample number (automatically or manually).
  3. Press READ/ENTER to confirm storage.

Data Retrieval:

  1. Press the RECALL key to enter the data review menu.
  2. Use the ▲/▼ keys to select the sample number.
  3. Press READ/ENTER to view detailed data.
  4. Press TIME/DATE to view the storage time.

Data Management:

  • Can store up to 99 sets of measurement data.
  • Supports deleting a single set of data by number.
  • Can delete all stored data at once.

4.3 Data Output and Printing

RS232 Interface Connection:

  1. Use the dedicated data cable to connect the instrument to a computer/printer.
  2. Set the communication parameters (9600 baud rate, 8 data bits, no parity).
  3. Press the PRINT key to send data.
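On the PC side, the 9600-8-N-1 link can be read with the third-party pyserial package. The port name and the record layout below are assumptions for illustration only (the actual printed format is defined by the instrument), so this sketch separates the line parser, which you would adapt to the real output, from the serial plumbing.

```python
def parse_record(line: str) -> dict:
    """Parse one hypothetical printed record, e.g. '12 7.85 mg/L 06/15/24 14:32'.

    The field layout here is an assumption for illustration; adapt it to
    the records your instrument actually emits over RS232.
    """
    fields = line.split()
    return {
        "sample": int(fields[0]),
        "value": float(fields[1]),
        "unit": fields[2],
        "date": fields[3],
        "time": fields[4],
    }

def read_records(port: str = "/dev/ttyUSB0"):
    """Open the serial link (9600 baud, 8 data bits, no parity, 1 stop bit)
    and yield parsed records. Requires pyserial; imported lazily so the
    parser above works without it."""
    import serial  # pip install pyserial
    with serial.Serial(port, baudrate=9600, bytesize=serial.EIGHTBITS,
                       parity=serial.PARITY_NONE,
                       stopbits=serial.STOPBITS_ONE, timeout=5) as link:
        while True:
            raw = link.readline()
            if not raw:  # read timed out with no data
                break
            text = raw.decode("ascii", errors="ignore").strip()
            if text:
                yield parse_record(text)

print(parse_record("12 7.85 mg/L 06/15/24 14:32"))
```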

Printing Options:

  • Print the current measurement value.
  • Print specified stored data.
  • Print all stored data.

Computer Connection:

  1. Install the HachLink™ software.
  2. Set up a terminal program (such as HyperTerminal) to receive data.
  3. Enable automatic data collection and storage.

Chapter 5: Instrument Maintenance and Troubleshooting

5.1 Daily Maintenance Points

Electrode Maintenance:

  • Regularly replace the electrolyte and membrane (recommended every 1-2 months).
  • Clean the electrode surface to avoid contamination.
  • Keep the electrode moist during short-term storage.
  • Store the electrode dry during long-term storage.

Instrument Cleaning:

  • Wipe the outer shell with a damp cloth.
  • Avoid using organic solvents.
  • Keep the keyboard and interface dry.

Calibration Recommendations:

  • Perform zero calibration before using the instrument each day.
  • Perform full-scale calibration once a week.
  • Recalibrate after replacing the electrolyte or membrane.

5.2 Common Faults and Troubleshooting

Display Problems:

  • No display: Check battery installation and power connection.
  • Blurry display: Adjust the contrast or replace the batteries.
  • “LOW BATTERY”: Replace all 4 batteries.

Measurement Abnormalities:

  • Unstable readings: Check the electrode connection and clean the electrode.
  • Slow response: Replace the electrolyte and membrane.
  • Calibration failure: Check the calibration solution and confirm the electrode status.

Error Codes:

  • Err 1: Sensor failure, check the electrode connection.
  • Err 2: Out of measurement range, dilute the sample.
  • Err 3: Calibration error, recalibrate.

Chapter 6: Safety Regulations and Quality Assurance

6.1 Safety Operation Regulations

Danger Warnings:

  • Do not use Ni-Cd rechargeable batteries as there is a risk of explosion.
  • Avoid contact of the electrode with strong acid and alkali solutions.
  • Do not leave the instrument immersed in water for extended periods (the IP67 rating covers only temporary immersion).

Operation Precautions:

  • Wear protective equipment when handling chemical reagents.
  • Use standard solutions according to the instructions.
  • Dispose of used electrolyte as hazardous waste.

6.2 Quality Assurance and Service Support

Warranty Policy:

  • Standard warranty period is 1 year (from the date of shipment).
  • Covers material and workmanship defects.
  • Unauthorized disassembly will void the warranty.

Maintenance Services:

  • Users are not allowed to repair any parts other than the batteries by themselves.
  • Contact an authorized service center for handling.
  • Provide the instrument model and serial number when requesting maintenance.

Chapter 7: Practical Application Tips

7.1 Tips for Improving Measurement Accuracy

Sample Handling:

  • Avoid vigorous shaking to prevent bubble generation.
  • Keep the sample temperature stable.
  • Allow the electrode to acclimate to the sample temperature before measurement.

Electrode Maintenance:

  • Regularly replace the electrolyte and membrane.
  • Keep the membrane moist during storage.
  • Clean the electrode gently with a soft cloth.

Environmental Control:

  • Avoid strong electromagnetic interference.
  • Keep the measurement environment temperature stable.
  • Set the correct salinity compensation for high-salinity samples.

7.2 Handling Special Application Scenarios

Low Dissolved Oxygen Measurement:

  • Use zero calibration to improve accuracy at the low end.
  • Extend the stabilization time.
  • Avoid contact between the sample and air.

High-Salinity Water Samples:

  • Accurately set the salinity compensation value.
  • Consider using a dedicated high-salinity electrode.
  • Increase the calibration frequency.

Flowing Water Body Measurement:

  • Ensure sufficient contact between the electrode and the water.
  • Use a flow cell attachment.
  • Avoid measurement positions with eddies or bubbles.

Conclusion

The Hach Sension6 portable dissolved oxygen meter is a fully functional and easy-to-operate professional water quality analysis instrument. Through the systematic introduction in this guide, users should be able to proficiently master all functions of the instrument, from basic operations to advanced applications. Correct operation methods and regular maintenance can not only ensure the accuracy of measurement data but also extend the service life of the instrument. When encountering problems that cannot be resolved, promptly contact the professional technical service personnel of Hach Company to avoid improper operation causing instrument damage or data loss.

Dissolved oxygen monitoring plays an irreplaceable role in water environment protection, aquaculture, and sewage treatment. It is hoped that this guide can help users fully leverage the performance advantages of the Sension6 portable dissolved oxygen meter, providing reliable technical support for water quality monitoring work and jointly safeguarding the health of our water environment.

Posted on

Comprehensive User Guide for Hach DR1010 COD Determinator

Preface: The Importance of COD Determination Technology and an Overview of the Instrument

Chemical Oxygen Demand (COD) is a crucial indicator in water quality monitoring, reflecting the extent of water pollution caused by reducing substances. The Hach DR1010 COD Determinator, a professional water quality analysis instrument, is widely used in environmental monitoring, sewage treatment, and industrial wastewater testing. This guide aims to comprehensively analyze the operational procedures, functional features, maintenance, and troubleshooting methods of the DR1010 based on the user manual, helping users obtain accurate and reliable test results.

Developed by Hach Company, the DR1010 COD Determinator is controlled by a microprocessor and features an LED light source, making it suitable for laboratory or on-site measurements. It has four built-in COD test programs, supports user-created calibration curves, and can store up to 40 user programs. The instrument offers flexible power options (a 6V adapter or four AA alkaline batteries), operates within a temperature range of 0 to 50°C, and meets the IP41 protection standard.

Chapter 1: Instrument Structure and Function Details

1.1 Instrument Composition and Standard Accessories

The standard configuration of the DR1010 COD Determinator includes:

  • Power adapter (Product No.: 9185600)
  • Data transfer cable (RS232 port, black)
  • Document bag (containing operation manual, method manual, and certificate of conformity)

Optional accessories:

  • COD test tubes (16mm × 100mm, with tube caps)
  • Data printing cable (RS232 port, gray)
  • DRB200 digestor
  • Bottle-top dispensers
  • Pipettes

1.2 Instrument Technical Parameters

  • Wavelength range: 420nm and 610nm dual wavelengths
  • Wavelength accuracy: ±1nm
  • Photometric measurement linearity: ±0.002A (0-1A)
  • Photometric measurement repeatability: ±0.005A (0-1A)
  • Light source: LED
  • Detector: Silicon photodiode
  • Data display: Four-digit LCD, 1.5 cm character height
  • Readout modes: % transmittance, absorbance, concentration
  • External output: RS232 serial port
  • Power supply: 190~240VAC/50Hz adapter or four AA alkaline batteries
  • Instrument dimensions: 24.0 × 19.8 × 12.0 cm
  • Instrument weight: 2 kg
  • Operating temperature: 0 to 50°C
  • Storage temperature: -20 to 60°C

1.3 Keyboard Function Details

Program Selection Keys:

  • High-range 2h: Selects the high-range two-hour digestion method; acts as the number key 7 in numeric mode.
  • Low-range 2h: Selects the low-range two-hour digestion method; acts as the number key 4 in numeric mode.
  • High-range rapid: Selects the high-range 15-minute digestion method; acts as the number key 1 in numeric mode.
  • Low-range rapid: Selects the low-range 15-minute digestion method; acts as the number key 1 in numeric mode.

Function Keys:

  • Print: Prints current data; acts as the number key 8 in numeric mode.
  • Save: Stores the current reading; acts as the number key 5 in numeric mode.
  • Historical data: Retrieves stored sample data; acts as the number key 2 in numeric mode.
  • Zero: Uses the current sample blank for zero adjustment; acts as the number key 0 in numeric mode.
  • Setup: Enters the setup menu; acts as the number key 9 in numeric mode.
  • Time/Date: Displays the current time or date; acts as the number key 6 in numeric mode.
  • Unit conversion: Converts between concentration, absorbance, and % transmittance; acts as the number key 3 in numeric mode.
  • Read: Reads and displays the sample concentration; inputs decimal points or switches between positive and negative signs in numeric mode.
  • Return: Cancels the current input or selection.
  • △/▽: Scrolls up and down within the menu.
  • Enter: Selects a menu item or accepts an input value.

Chapter 2: Initial Instrument Setup and Calibration

2.1 Battery Installation and Power Management

  • Turn the instrument over and ensure the sample cell is empty.
  • Open the battery compartment cover and install four AA alkaline batteries according to the markings.
  • Re-cover the battery compartment and turn the instrument back to its upright position.

Important Tips:

  • Use alkaline batteries. Do not use rechargeable Ni-Cd batteries.
  • Replace all batteries when changing them.
  • When the battery level is low, the LOW BATTERY icon will be displayed. Replace the batteries promptly.
  • It is recommended to remove the batteries if the instrument is not used for an extended period.

2.2 Date and Time Setup

Date Setup:

  • Press the “Setup” key to enter the SETUP menu.
  • Select the DATE option and input the four-digit year, month, and day.
  • Press the “Enter” key to confirm.

Time Setup:

  • In the SETUP menu, select the TIME option.
  • Input the time in 24-hour format.
  • Press the “Enter” key to confirm.

2.3 Proper Use of Sample Tubes

  • Wipe the outer surface of the sample tube with a lint-free cloth.
  • Insert the tube into the instrument’s tube holder, with the HACH logo facing the display.
  • Ensure consistent insertion direction for each measurement.
  • Check that the sample tube is clean and free of scratches before measurement.

Chapter 3: Detailed Instrument Operation Procedures

3.1 Basic Measurement Steps

Determinator Setup:

  • Upon startup, the instrument automatically enters the program used last time.
  • Press the corresponding program key to select a program and press the “Enter” key to confirm.

Sample Preparation:

  • Prepare the zero solution and the sample to be tested according to the program instructions.

Instrument Zeroing:

  • Place the blank solution in the sample cell.
  • Close the cover and press the “Zero” key.
  • When the instrument displays 0 and the READ icon appears, measurement can begin.

Sample Measurement:

  • Place the sample to be tested in the holder.
  • Close the cover and press the “Read” key.
  • The display shows the measurement result.
  • Press the “Unit conversion” key to switch the display mode.

3.2 Standard Curve Adjustment Method

  • Prepare standard solutions.
  • Measure the standard solutions as samples in the program.
  • After obtaining the readings, press the “Setup” key and scroll to the “STD” setting item.
  • Input the actual concentration of the standard solution and press the “Enter” key.

Notes:

  • Consider sample interference before adjustment.
  • After adjustment, test multiple concentration standard solutions to verify the applicability of the curve.
  • If the input calibration value is out of range, the instrument will emit a beep to indicate an error.
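The adjustment described in section 3.2 amounts to scaling readings by the ratio of the standard's true concentration to its measured value. A hedged sketch of that arithmetic follows; the function names are illustrative and this is not the firmware's exact algorithm.

```python
def curve_factor(measured_std: float, actual_std: float) -> float:
    """One-point adjustment factor from a standard: actual / measured."""
    if measured_std <= 0:
        raise ValueError("measured standard concentration must be positive")
    return actual_std / measured_std

def adjusted_reading(raw_reading: float, factor: float) -> float:
    """Apply the one-point adjustment factor to a raw reading."""
    return raw_reading * factor

# A 500 mg/L COD standard that reads 480 mg/L gives a factor of ~1.042,
# so a later raw reading of 240 mg/L is reported as about 250 mg/L:
f = curve_factor(measured_std=480.0, actual_std=500.0)
print(round(adjusted_reading(240.0, f), 1))
```

Verifying with a second standard at a different concentration (as the notes recommend) guards against applying a one-point correction to a non-linear interference.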

3.3 Data Storage and Retrieval

Data Storage:

  • After the measurement result is displayed, press the “Save” key.
  • The display shows the next available storage sequence number.
  • Press the “Enter” key to accept or input a specific sequence number.

Data Retrieval:

  • Press the “Historical data” key to enter the RECALL menu.
  • Use the “▽” or “△” key or numeric keys to select the sample sequence number.
  • Press the “Enter” key to display the stored data.

Chapter 4: Advanced Function Applications

4.1 User Program Creation Method

  • Press the “Setup” key and select the USER option.
  • Input the program number to be created (20-59).
  • Select the wavelength.
  • Prepare standard solutions and perform zero adjustment on the instrument.
  • Measure the absorbance values of the standard solutions.
  • Repeat the steps to complete the input of all standard points.
  • Press the “Return” key and select to store the program.

Key Points:

  • A minimum of 2 data points and a maximum of 12 are required.
  • At 420nm, the absorbance should decrease as the concentration increases.
  • At 610nm, the absorbance should increase as the concentration increases.
  • The instrument will ignore identical absorbance values and emit a beep.
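A user program built from 2-12 standard points behaves like a piecewise-linear absorbance-to-concentration curve. The sketch below works under that assumption (the firmware's exact fitting method is not documented here) and mirrors the checks above: the 2-12 point limit and the rejection of identical absorbance values.

```python
def build_curve(points):
    """points: list of (absorbance, concentration) pairs, 2-12 entries.

    Returns a function mapping absorbance -> concentration by linear
    interpolation between the standard points (extrapolating along the
    end segments). Mirrors the 2-12 point rule and the duplicate-
    absorbance rejection described in the text.
    """
    if not 2 <= len(points) <= 12:
        raise ValueError("need between 2 and 12 standard points")
    pts = sorted(points)
    absorbances = [a for a, _ in pts]
    if len(set(absorbances)) != len(absorbances):
        raise ValueError("identical absorbance values are not allowed")

    def concentration(a: float) -> float:
        # Find the segment bracketing a, clamping to the end segments.
        i = max(0, min(len(pts) - 2,
                       sum(1 for x in absorbances if x < a) - 1))
        (a0, c0), (a1, c1) = pts[i], pts[i + 1]
        return c0 + (c1 - c0) * (a - a0) / (a1 - a0)

    return concentration

curve = build_curve([(0.05, 0.0), (0.45, 100.0), (0.85, 200.0)])
print(curve(0.25))  # halfway up the first segment
```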

4.2 Data Transmission and Printing

Printer Connection:

  • Connect the instrument and the printer using the gray data printing cable.
  • Press the “Print” key to manually initiate printing.

Computer Connection:

  • Connect the instrument and the computer using the black data transfer cable.
  • Set the terminal program (such as HyperTerminal) communication parameters.
  • Start the text capture function.
  • Press the “Print” key to transmit data to a text file.

4.3 Batch Data Processing

  • Print all data: Select PRINT ALL in the SETUP menu.
  • Delete all data: Select ERASE ALL in the SETUP menu.
  • Data export: Transfer all data to a computer through the RS232 interface.

Chapter 5: Instrument Maintenance and Troubleshooting

5.1 Daily Maintenance Points

Cleaning and Maintenance:

  • Wipe the instrument’s outer shell with a damp cloth.
  • Promptly clean up any spilled reagents.
  • Clean the sample cell holder with a cotton swab.
  • Wipe the outer surface of the sample cell with lens paper or a soft, lint-free cloth.

Battery Management:

  • Replace low-battery cells promptly.
  • Remove the batteries if the instrument is not used for an extended period.
  • Reset the date and time after replacing the batteries.

Storage Conditions:

  • Storage temperature: -20 to 60°C
  • Relative humidity: Below 80% (at 40°C)
  • Avoid strong electromagnetic field environments.

5.2 Common Fault Exclusion

Error Codes and Solutions:

  • Err 1: Unable to set the instrument. Contact Hach customer service.
  • Err 2: Unable to read program data. Contact Hach customer service.
  • Err 3: Unable to write program data. Contact Hach customer service.
  • Err 4: Measurement battery error. Replace the batteries.
  • Err 5: Measurement A/D error. Contact Hach customer service.
  • Err 6: Measurement offset error. Check the installation of the light blocker.
  • Err 7: Low photometric intensity error. Check for light-channel blockage or dilute the sample.
  • Err 8: Measurement value out of range. Confirm the instrument cover is installed, or contact customer service.

Other Common Problems:

  • Concentration out of range: Dilute the sample and re-measure.
  • Beep/error icon: Check the operational steps.
  • Low battery level: The LOW BATTERY icon is displayed. Replace the batteries promptly.

Chapter 6: Safety Regulations and Quality Assurance

6.1 Safety Operation Regulations

Hazard Levels:

  • Danger (DANGER): Situations that may lead to death or serious injury.
  • Caution (CAUTION): Situations that may lead to minor or moderate injury.
  • Note (NOTE): Information that requires special emphasis.

Key Safety Tips:

  • Review the Material Safety Data Sheet (MSDS) and be familiar with safety procedures when handling chemical samples.
  • The instrument should not be used for samples that are flammable or contain hydrocarbons.
  • Do not use Ni-Cd rechargeable batteries.
  • Do not open the instrument’s chassis without authorization.

6.2 Quality Assurance and Service Support

Quality Assurance:

  • Most products are guaranteed for at least one year from the shipping date.
  • The warranty covers defects in materials and manufacturing.

Repair Services:

  • Users should not attempt to repair any parts other than the batteries by themselves.
  • Contact an authorized Hach Company service center for repairs.

Chapter 7: Practical Application Tips and Experience Sharing

7.1 Best Practices for COD Measurement

Sample Handling Tips:

  • Ensure the sample is representative and mix it thoroughly before sampling.
  • Follow the digestion time and temperature requirements strictly.
  • Use reagents from the same batch for comparative measurements.

Methods to Reduce Errors:

  • Regularly verify the instrument’s accuracy using standard solutions.
  • Keep the sample tube clean.
  • Perform zero adjustment before each measurement.
  • Take the average of multiple measurements of the same sample.

7.2 Handling Special Application Scenarios

High-Salinity Sample Measurement:

  • High-salinity samples may cause interference; it is recommended to conduct a spike recovery test.
  • Establish a specific calibration curve if necessary.

Low-Concentration Sample Measurement:

  • Use the low-range program to improve sensitivity.
  • Extend the measurement time or increase the sample volume.

Chapter 8: Instrument Verification and Compliance

8.1 Performance Verification Methods

Blank Test:

  • Measurement of ultrapure water should read 0 mg/L COD.

Standard Sample Test:

  • Use COD standard solutions with known concentrations for verification.

Repeatability Test:

  • Measure the same sample multiple times and calculate the relative standard deviation.
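The relative standard deviation used in the repeatability test can be computed as follows. This is a generic sketch using the standard library, not instrument software:

```python
import statistics

def relative_std_dev(readings):
    """Percent relative standard deviation (RSD) of repeated measurements."""
    mean = statistics.mean(readings)
    if mean == 0:
        raise ValueError("mean of readings is zero; RSD is undefined")
    # statistics.stdev is the sample (n-1) standard deviation
    return 100.0 * statistics.stdev(readings) / mean

# Three repeated measurements of the same sample (mg/L)
print(round(relative_std_dev([100.0, 102.0, 98.0]), 2))  # 2.0
```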

Comparison Test:

  • Compare the results with standard methods or other instruments.

8.2 Compliance Certification

LED Safety:

  • Complies with EN60825-1 standard, Class 1 LED product.

Anti-Interference Characteristics:

  • Complies with EN 50082-1 general anti-interference standard.

EMC Electromagnetic Compatibility:

  • EN 61000-4-2: immunity to electrostatic discharge.
  • EN 61000-4-3: immunity to radiated RF electromagnetic fields.
  • ENV 50204: immunity to radiated fields from digital radio telephones.

Radio Frequency Emissions:

  • Complies with EN 55011 (CISPR 11) Class B emission limits.

Conclusion

The Hach DR1010 COD Determinator is a powerful and easy-to-use professional water quality analysis instrument. Through systematic learning of this guide, users should be able to master all the functions of the instrument, from basic operations to advanced applications. Correct operational methods and regular maintenance not only ensure the accuracy of measurement data but also extend the instrument’s service life. When encountering problems that cannot be resolved, users should promptly contact Hach Company’s professional technical service personnel to avoid improper operations that may cause instrument damage or data loss.

With the continuous improvement of environmental protection requirements, the importance of COD monitoring is becoming increasingly prominent. It is hoped that this guide will help users fully leverage the performance advantages of the DR1010 COD Determinator and provide reliable technical support for water quality monitoring and environmental protection work.


User Manual and Operation Guide for Thermo Fisher FlashSmart Intelligent Elemental Analyzer (FlashSmart EA)

I. Instrument Overview and Basic Operations

1.1 Instrument Introduction

The Thermo Fisher FlashSmart Elemental Analyzer is a fully automated organic elemental analysis system that employs the dynamic combustion method (modified Dumas method) to determine nitrogen, carbon, hydrogen, and sulfur content. It measures oxygen content through high-temperature pyrolysis. This instrument can be configured with a single-channel or dual independent-channel system, and the MultiValve Control (MVC) module enables automatic dual-channel switching for analysis.

Main Technical Parameters:

  • Detector Type: Thermal Conductivity Detector (TCD)
  • Power Supply: 230V ± 10%, 50/60Hz, 1400VA
  • Dimensions: 50cm (height) × 59cm (width) × 58cm (depth)
  • Weight: 65kg
  • Maximum Operating Temperature: 1100℃
  • Gas Requirements: High-purity helium (carrier gas), oxygen (combustion aid), argon (for specific configurations)

1.2 Safety Precautions

Hazardous Operation Warnings:

  • High Voltage Risk: The instrument contains high-voltage components. Non-professionals are prohibited from opening the electrical compartment.
  • High-Temperature Surfaces: The furnace can reach temperatures up to 1100℃. Avoid contact during operation.
  • Gas Safety: Hydrogen use requires extreme caution, as concentrations as low as 4% pose an explosion risk.
  • Chemical Hazards: Wear protective gear when handling reaction tube packing materials and sample ashes.

Personal Protective Equipment (PPE) Requirements:

  • Eye Protection: Splash-resistant goggles
  • Hand Protection: White nitrile gloves (for chemicals)/heat-resistant gloves (for high-temperature operations)
  • Respiratory Protection: Dust masks
  • Body Protection: Lab coats + plastic aprons

1.3 Startup Preparation Procedure

Gas Connection:

  • Helium Inlet Pressure: 2.5bar (36psig)
  • Oxygen Inlet Pressure: 2.5-3bar (36-44psig)
  • Argon Inlet Pressure: 2.5bar (N/Protein configuration) or 4-4.5bar (NC Soils configuration)
  • Leak Testing: Perform on all gas lines.

Power Connection:

  • Confirm voltage stability at 230V ± 10%.
  • Ensure proper grounding; avoid sharing circuits with large motor equipment.

Software Installation:

  • System Requirements: Windows 7/8/10, at least 1GB hard drive space.
  • Install EagerSmart data processing software and drivers.

II. Calibration and Adjustment Procedures

2.1 Initial Setup

Hardware Configuration Steps:

  • Select Reaction Tube Configuration Based on Analysis Needs:
    • CHN Mode: Quartz reaction tube + chromium oxide/reduced copper/cobalt oxide packing.
    • CHNS Mode: Quartz reaction tube + copper oxide/electrolytic copper packing.
    • O Mode: Quartz reaction tube + nickel-plated carbon/quartz shavings packing.
    • N Mode: Dual reaction tubes in series + Plexiglas adsorption filter.
  • Install Autosampler:
    • MAS Plus Solid Autosampler: Up to 125-position sample tray.
    • AI 1310/AS 1310 Liquid Autosamplers: 8-position or 105-position sample trays.
  • Connect MVC Module (Dual-Channel Configuration):
    • Remove bypass panel from the rear.
    • Connect gas lines for left and right channels.
    • Configure dual MAS Plus autosamplers.

2.2 System Calibration

Three-Step Calibration Method:

  • Leak Testing:
    • Initiate automatic leak detection via software.
    • Acceptable Leak Rate: <0.1mL/min.
    • Use soapy water to locate leaks if detected.
  • Signal Baseline Adjustment:
    • Set TCD detector temperature constant (typically 40-120℃).
    • Adjust bridge voltage to 5V.
    • Baseline Drift: Should be <0.1mV/10min.
  • Standard Curve Establishment:
    • Use high-purity standards like acetanilide (nitrogen 16.09%, carbon 71.09%, hydrogen 6.70%).
    • Minimum Concentration Gradients: 5 points (recommended range: 0.1-5mg).
    • Correlation Coefficient (R²): Should be >0.999.
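The calibration-curve acceptance criterion above (R² > 0.999) can be checked with an ordinary least-squares fit. A minimal pure-Python sketch with illustrative, not measured, data:

```python
def linear_fit(x, y):
    """Ordinary least-squares line y = a*x + b and its R^2 for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx                      # slope
    b = my - a * mx                    # intercept
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

# Five standard masses (mg) vs. a hypothetical detector response
masses = [0.1, 0.5, 1.0, 2.5, 5.0]
areas = [12.0, 60.0, 120.0, 300.0, 600.0]  # perfectly linear, slope 120
slope, intercept, r2 = linear_fit(masses, areas)
print(round(slope, 6), round(r2, 6))  # 120.0 1.0
```

In practice the R² from real standards will fall slightly below 1; the curve is rejected and remade if it drops to 0.999 or lower.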

Calibration Frequency Recommendations:

  • Daily Use: Calibrate after each startup.
  • Continuous Analysis: Verify calibration every 50 samples.
  • After Consumable Replacement: Recalibration is mandatory.

2.3 Method Optimization

Parameter Adjustment Guidelines:

  • Oxygen Injection Time:
    • Regular Samples: 4-6 seconds.
    • Refractory Samples: Extend to 8 seconds.
    • High-Sulfur Samples: Add vanadium pentoxide as a combustion aid.
  • Furnace Temperature Settings:
    • Combustion Furnace: 950-1100℃.
    • Reduction Furnace: 840℃.
    • Pyrolysis Furnace (O Mode): 1060℃.
  • Carrier Gas Flow Rate:
    • Helium: 100-140mL/min.
    • Reference Gas: 30-50mL/min.

III. Routine Maintenance

3.1 Regular Maintenance Schedule

Maintenance Schedule Table:

  • Reaction Tube Regeneration (every 200 analyses): Empty the packing material; incinerate at 550℃ for 2 hours.
  • Adsorbent Replacement (monthly): Activate the molecular sieve at 300℃; replace the desiccant (silica gel) promptly.
  • Autosampler Cleaning (weekly): Ultrasonically clean tin/silver cups; inspect piston seals.
  • Chromatographic Column Aging (quarterly): Age at 280℃ with carrier gas for 8 hours.
  • Comprehensive System Verification (annually): Conducted by a professional engineer.

3.2 Key Component Maintenance

Reaction Tube Packing Guidelines:

  • Quartz Reaction Tubes:
    • Begin packing from the conical end.
    • Compact each layer with a dedicated tamping rod.
    • Separate layers with quartz wool.
    • Maintain total packing height at 80% of tube length.
  • HPAR Alloy Steel Reaction Tubes:
    • Must be used with crucibles.
    • Ensure uniform distribution of oxidation catalysts.
    • Use dedicated tools for installation/removal.

Adsorption Filter Maintenance:

  • Large (Plexiglas) Filters:
    • Packing sequence: Quartz wool → soda lime → molecular sieve → silica gel.
    • Pre-moisten soda lime with 0.5mL water.
  • Small (Pyrex) Filters:
    • Used in CHNS/O modes.
    • Packing: Quartz wool → anhydrous magnesium perchlorate.

3.3 Consumable Replacement Intervals

Recommended Replacement Intervals:

  • Quartz Wool: Replace when changing reaction tube packing.
  • Reduced Copper: Every 500 analyses.
  • Oxidation Catalyst: Every 300 analyses.
  • Nickel-Plated Carbon (O Mode): Every 150 analyses.
  • TCD Filament: Replace when baseline noise occurs.
  • Sealing O-Rings: Replace if leaks are detected or every 6 months.

IV. Troubleshooting and Solutions

4.1 Common Error Codes

Error Code Table:

  • E01 (Left Furnace Temperature Exceeded): Check the thermocouple connection; restart the system.
  • E04 (TCD Signal Overflow): Adjust the gain; verify carrier gas purity.
  • E12 (Safety Cutoff Triggered): Check the cooling fan; allow the system to cool.
  • E25 (EFC-t Module Flow Abnormality): Check for gas line blockages; clean the filter.
  • E33 (Autosampler Communication Failure): Reconnect cables; verify port settings.

4.2 Typical Problem Resolution

Analysis Result Anomaly Investigation:

  • Low Nitrogen Results:
    • Check whether the reduced copper is exhausted (discolored black).
    • Verify adequate oxygen injection.
    • Confirm complete sample combustion (observe flame).
  • Sulfur Peak Tailing:
    • Replace copper oxide packing layer.
    • Add vanadium pentoxide combustion aid.
    • Check chromatographic column connections for leaks.
  • Unstable Oxygen Results:
    • Verify nickel-plated carbon packing height (should be 60mm).
    • Confirm silver cup seal integrity.
    • Validate pyrolysis furnace temperature stability (±2℃).

Hardware Fault Handling:

  • Furnace Temperature Failure to Rise:
    • Check SSR solid-state relay status.
    • Measure transformer output voltage (should be 48V AC).
    • Confirm fuse integrity (AC 1112 board F1/F2).
  • Abnormal Gas Flow:
    • Clean EFC-t module filter.
    • Verify solenoid valve EV1-EV4 operation.
    • Calibrate flow sensors S1/S2.
  • TCD Baseline Drift:
    • Extend equilibration time to 2 hours.
    • Verify reference gas flow stability.
    • Replace aged filament.

4.3 Emergency Response Procedures

Safety Emergency Plan:

  • Gas Leak:
    • Immediately close cylinder main valve.
    • Activate laboratory ventilation system.
    • Avoid operating electrical equipment.
  • Furnace Overheating:
    • Trigger front panel emergency stop button.
    • Cut off main power supply.
    • Purge system with inert gas.
  • Abnormal Combustion:
    • Keep the system enclosure closed.
    • Direct exhaust through fume hood.
    • Do not cool directly with water.

V. Advanced Application Techniques

5.1 Special Sample Handling

Solutions for Challenging Samples:

  • High Inorganic Salt Samples:
    • Use quartz crucibles to prevent corrosion.
    • Reduce quartz wool between packing layers.
    • Increase oxygen injection pressure by 10%.
  • Volatile Liquids:
    • Utilize AI 1310 liquid autosampler.
    • Adsorb sample onto diatomaceous earth.
    • Preheat injection needle to 40℃.
  • Viscous Samples:
    • Grind with quartz sand for homogenization.
    • Use specially shaped tin cups.
    • Extend combustion time by 20%.

5.2 Data Quality Enhancement

Best Practice Recommendations:

  • Sample Preparation:
    • Homogenize to below 80 mesh.
    • Pre-dry samples with >5% moisture content.
    • Avoid fluorine-containing containers.
  • Weighing Techniques:
    • Use blank tin cups for calibration with microsamples (<1mg).
    • Employ “sandwich” loading method for highly volatile samples.
    • Utilize a 0.1μg precision balance.
  • Quality Control:
    • Insert standard samples every 10 analyses.
    • Maintain parallel sample deviation <1.5%.
    • Retain all original chromatograms.

5.3 Automation Features

Intelligent Function Applications:

  • Standby Mode:
    • Reduce carrier gas to 10mL/min.
    • Maintain furnace temperature at 50% of setpoint.
    • Auto-wake via timer function.
  • Sequence Analysis:
    • Supports 125-sample unattended operation.
    • Enables alternating method runs.
    • Auto-generates comprehensive reports.
  • Remote Monitoring:
    • View system status remotely via EagerSmart software.
    • Set up email alerts.
    • Auto-backup data to network.

VI. Appendices and Support

6.1 Technical Specifications Summary

Key Parameter Quick Reference Table:

  • Detection Limits: N/C/H 0.01%, S/O 0.02%
  • Precision: RSD <0.5% (for conventional elements)
  • Analysis Time: CHN 5min, O 4min, CHNS 6min
  • Sample Size: 0.01-100mg (solid), 0.1-10μL (liquid)
  • Gas Consumption: Approximately 10L helium per sample

6.2 Regulatory Compliance

Certifications and Compliance:

  • CE Certification: Complies with EN 61010-1 safety standards.
  • RoHS: Complies with Directive 2011/65/EU.
  • WEEE: Classification number 23103000.
  • GLP/GMP Compliance: Meets regulatory requirements.

This guide is based on the FlashSmart Elemental Analyzer Operating Manual (P/N 31707001, Revision E) and covers key points for the instrument’s operational lifecycle. Always adapt usage to specific configurations and application needs while strictly adhering to local safety regulations.


Comprehensive User Guide for Thermo Fisher Orion 3106 COD Analyzer

I. Instrument Overview and Safety Precautions

1.1 Product Introduction

The Thermo Fisher Orion 3106 Chemical Oxygen Demand (COD) Online Automatic Monitor is a high-precision analytical device designed specifically for water quality monitoring. It is widely used for fixed-point monitoring of wastewater discharge at key pollution sources and at the outlets of sewage treatment plants. The instrument employs a 450nm colorimetric measurement principle, with a measurement range of 20 – 2000 mg/L COD and a minimum detection limit of 4 mg/L. The indication error is ±10% (tested with potassium hydrogen phthalate), meeting the stringent requirements of various water quality monitoring applications.

The instrument consists of two main parts: an electrical control system and a water sample analysis system. The electrical control system includes a power module, a circuit control system, and a user interaction panel, featuring functions such as power-on self-test and fault alarm. The water sample analysis system encompasses functions for water sample and reagent intake, water sample digestion, and measurement analysis. It utilizes syringe pumps for high-precision intake and implements precise temperature control to ensure complete and thorough digestion.

1.2 Safety Precautions

Before using the Orion 3106 COD Monitor, the following safety regulations must be strictly adhered to:

Electrical Safety:

  • Disconnect the power supply before performing maintenance or internal wiring on the instrument.
  • Do not operate the instrument with the safety panel or electrical cabinet door open.
  • All electrical connections must comply with local or national safety regulations.

Chemical Safety:

  • Wear protective gear (lab coat, protective goggles/face shield, protective gloves) before replacing reagents.
  • Work only in areas equipped with exhaust ventilation.
  • Use only glass or Teflon materials when handling chemicals.
  • Dispose of waste liquids (containing heavy metal ions such as silver, mercury, and chromium) in accordance with local regulations.

Operational Environment Safety:

  • Do not use the instrument in environments not specified in this manual.
  • Do not open the safety panels inside the equipment during operation.
  • Never use deionized water, drinking water, or beverages as a substitute for reagents to prevent explosion of the digestion tube.

Special Warnings:

  • The instrument may contain overheated components (up to 175°C) and high-pressure areas.
  • Various safety labels (electric shock warning, grounding warning, overheating warning, etc.) are affixed to the instrument. Carefully identify them before operation.

II. Instrument Installation and Initial Setup

2.1 Pre-installation Preparation

Unpacking Inspection:

  • Check the outer packaging for any visible damage. If found, report it to the shipping company.
  • Verify the product and accessories against the packing list. Immediately contact the Thermo Fisher representative office if any items are missing or damaged.

Installation Environment Requirements:

  • Operating temperature: 5°C to 40°C (recommended 20 ± 10°C).
  • Maximum humidity: 90% RH (recommended non-condensing).
  • Can be installed outdoors (IP66 protection rating), but avoid direct sunlight and ensure the diurnal temperature variation does not exceed ±10°C.
  • Install as close as possible to the sample source to minimize water sample analysis delay.
  • Avoid environments with irritating or corrosive gases.

2.2 Instrument Installation Steps

Installation Method Selection:

  • Wall mounting: Ensure the wall can withstand at least four times the weight of the instrument (approximately 40 kg).
  • Bracket mounting: Use the four M8 base screws provided with the instrument for fixation.

Space Requirements:

  • Reserve at least 700 mm of space on the right side for easy door opening.
  • Reserve sufficient space on the left side for piping and wiring.
  • The installation height should align the screen with the operator’s line of sight.
  • Ensure the instrument is level after installation (recommended to use a spirit level for adjustment).

Flow Cell Installation:

  • The flow cell must be installed in the lower left position of the instrument.
  • The installation position should be higher than the water level of the sampling pool.
  • Ensure the sampling tube is inserted into the flow cell and below the overflow level.
  • A 200-micron stainless steel filter screen must be installed and cleaned regularly.

Electrical Connection:

  • Power requirements: 100–240 VAC, 110 W, 50/60Hz.
  • Use a three-core power cord (minimum 0.75 mm²/18AWG) with a temperature resistance of ≥75°C.
  • It is recommended to install an external power switch or circuit breaker box (with leakage protection).

2.3 Tubing Connection and Reagent Preparation

Reagent System:

  • Prepare two types of reagents (Reagent 1 and Reagent 2) and 1 – 2 types of standard solutions.
  • Reagent bottle capacities: Reagent 1 (1000 mL), Reagent 2 (2000 mL), standard solution bottle (250 mL).
  • The tubing must be correctly inserted into the bottom of the corresponding reagent bottles, and ensure all bottle vents are unobstructed.

Waste Liquid System:

  • The waste liquid bucket should be no less than 25 L and placed below the instrument.
  • All three waste liquid tubes should be inserted into a single PVC main waste liquid tube with an inner diameter of 12 mm.
  • The waste liquid tubes should not be immersed in the waste liquid level to prevent back-suction.
  • Waste liquids should be treated as hazardous waste.

Deionized Water System:

  • The deionized water bucket should be no less than 18 L.
  • Water quality requirements: colorless and clear liquid with a resistivity > 0.5 MΩ·cm.

III. System Startup and Basic Operation

3.1 Initial Startup Procedure

Pre-power-on Inspection:

  • Confirm that the safety panel is installed.
  • Check that all tubing connections are correct.
  • Verify that reagents and deionized water are adequately prepared.

System Initialization:

  • After powering on, the instrument enters the initialization selection interface.
  • If the previous analysis process was forcibly stopped, it is recommended to select “Yes” to run initialization.
  • The “Auto Initialization” option in system management can be set to automatically complete this process.

Flow Path Priming:

  • Navigate to the menu: “Instrument Maintenance” > “Prime Solution” > “Prime All Tubing.”
  • The purpose is to expel air from the tubing and ensure normal subsequent analysis.

3.2 Operation Interface Explanation

Main Interface Display:

  • The two most recent measurement results (COD concentration values and measurement times).
  • The current status display area of the instrument.
  • The error or warning message display area.

Keyboard Function Definitions:

  • 【MENU】: Main interface key for quickly returning to the analysis results interface or the first-level menu.
  • 【RUN】: Run key for manually starting a test.
  • 【STOP】: Stop key for stopping the current test during operation.
  • 【ENTER】: Confirm key for parameter configuration or menu selection confirmation.
  • 【ESC】: Cancel operation key for returning to the previous menu.
  • Direction keys: For option movement or historical data page turning.
  • 【FUNC】: Function key for switching between large font/normal font display.

3.3 Menu Structure Overview

History Records:

  • View measurement results, calibration results, and other historical data.

Analysis Programs:

  • Verification, analysis, cleaning, pre-run, and post-run functions.

Parameter Settings:

  • Measurement parameters, calibration parameters, cleaning parameters, analysis parameters, etc.
  • System settings such as date and time, input and output, display, and communication.

Instrument Maintenance:

  • Maintenance functions such as priming, draining, precise calibration, and ordinary calibration.
  • Advanced options such as hardware settings and system management.

IV. Measurement Functions and Calibration

4.1 Measurement Parameter Settings

Analysis Mode Selection:

  • Manual mode: Starts one analysis each time the 【RUN】 key is pressed.
  • Automatic mode: Performs periodic continuous analysis with an adjustable analysis cycle.

Measurement Range Settings:

  • 20 – 200 mg/L: Suitable for low-concentration water samples.
  • 200 – 800 mg/L: Suitable for medium-concentration water samples.
  • 800 – 2000 mg/L: Suitable for high-concentration water samples.
  • Auto Range: Suitable for water samples with unknown or widely varying concentrations.

Analysis Parameter Settings:

  • Digestion temperature: Adjustable from 50 – 175°C.
  • Digestion time: Adjustable from 1 – 60 minutes.
  • Digestion cooling temperature: 40 – 80°C (recommended 65°C).
  • Measurement time setting mode: Manual fixed or automatic judgment.

4.2 Calibration Procedure

Calibration Parameter Settings:

  • Standard solution selection: 200 mg/L and/or 1000 mg/L.
  • Calibration range: Low, medium, high range, or combination.
  • Calibration mode: Manual or automatic (calibration cycle adjustable from 6 – 744 cycles).
  • Allowable deviation range: Default 10%.

Calibration Types:

  • Precise calibration: Each standard solution is run three times consecutively, and the average of the two closest values is taken.
  • Ordinary calibration: Each standard solution is run only once.
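The precise-calibration rule ("run three times, average the two closest values") can be sketched as below. This is one plain reading of the rule, not the instrument's firmware:

```python
from itertools import combinations

def precise_calibration_value(r1, r2, r3):
    """Average of the two closest readings out of three consecutive runs."""
    # Pick the pair of readings with the smallest absolute difference
    pair = min(combinations((r1, r2, r3), 2), key=lambda p: abs(p[0] - p[1]))
    return sum(pair) / 2.0

# 190 is the outlier; 198 and 202 are closest to each other
print(precise_calibration_value(198.0, 202.0, 190.0))  # 200.0
```

Discarding the outlying run this way makes the stored calibration value robust against a single bad injection.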

Calibration Execution Steps:

  • Enter the “Instrument Maintenance” menu and select the corresponding calibration type.
  • Follow the prompts to operate. The calibration parameters are automatically saved upon successful calibration.
  • Calibration results can be viewed in “History Records” > “Calibration Results.”

Verification Program:

  • Insert the hard tube of ERV port 7 into the standard water sample bottle to be verified.
  • Enter “Analysis Programs” > “Verification” to start the program.
  • After verification, the results and judgment are displayed (≤50 mg/L deviation ±5 mg/L is qualified, >50 mg/L deviation ±10% is qualified).
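The acceptance rule for verification can be written as a small function. This is a sketch of the stated criteria, with illustrative readings:

```python
def verification_passes(measured, nominal):
    """Apply the acceptance rule: nominal <= 50 mg/L allows +/-5 mg/L deviation;
    above 50 mg/L the deviation must be within +/-10% of nominal."""
    deviation = abs(measured - nominal)
    if nominal <= 50:
        return deviation <= 5.0
    return deviation <= 0.10 * nominal

print(verification_passes(47.0, 50.0))      # True  (deviation 3 <= 5 mg/L)
print(verification_passes(1080.0, 1000.0))  # True  (deviation 80 <= 100 mg/L)
print(verification_passes(1150.0, 1000.0))  # False (deviation 150 > 100 mg/L)
```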

V. Maintenance and Troubleshooting

5.1 Regular Maintenance Plan

Customer Self-maintenance Items (Weekly/Monthly):

  • Check and replace reagents and standard solutions.
  • Clean and refill the deionized water bucket.
  • Empty the waste liquid bucket.
  • Clean the flow cell.

Professional Maintenance Items:

  • Every 6 months: Clean the measurement chamber and syringe; replace sealing gaskets.
  • Every 12 months: Replace hose assemblies; clean the digestion tube; replace O-rings.
  • Every 24 months: Replace the syringe and digestion tube; replace all PTFE hard tubes and PVC waste liquid tubes.

5.2 Common Fault Handling

Alarm Information Handling:

  • Blank signal abnormality:
    • Above upper limit: Recalibrate the optical path.
    • Below lower limit: Check the deionized water and tubing for contamination.
  • Measurement result out of limit:
    • Reselect the range according to the actual concentration or enable the Auto Range function.
  • Calibration problems:
    • Calibration out of limit: Check if the standard solution is contaminated and recalibrate.
    • Intercept too low: Check if the reagents are correct and recalibrate.

Error Information Handling:

  • No sample/reagent deficiency:
    • Check tubing connections, bottle liquid levels, and syringe sealing.
  • Syringe pump failure:
    • Use the instrument’s diagnostic function to check the pump status.
    • Check electrical connections and mechanical components.
  • Temperature-related problems:
    • Check the heating wire, digestion tube, and temperature sensor.
    • Recalibrate the temperature sensor.
  • Leakage alarm:
    • Immediately power off.
    • Locate the leakage source and repair it.
    • Wipe dry the tray and all leaked liquids.

5.3 Long-term Shutdown Handling

  • Run the drainage program.
  • Remove the safety panel and insert all tubing into deionized water.
  • Run the “Prime All Tubing” program.
  • Run the cleaning program.
  • Remove the tubing and expose it to the air, then run the priming and cleaning programs again.
  • Reinstall the safety panel and power off.

VI. Advanced Functions and Communication

6.1 Pre-run/Post-run Functions

Pre-run Settings:

  • Used to start external devices (such as pretreatment devices) before analysis.
  • Relay action and delay time (0 – 120 minutes) can be set.
  • Configured through the “Analysis Programs” > “Pre-run” menu.

Post-run Settings:

  • Used to start external devices after analysis.
  • Set in a similar manner to pre-run, with time calculated from the end of analysis.

6.2 Modbus Communication

Communication Settings:

  • Baud rate: Default 9600 (can be set to 19200).
  • Modbus slave address: Default 1 (can be changed).

Register Configuration:

  • Basic information: Address, protocol, pollutant type, etc.
  • Measurement data: Concentration, absorbance, status, etc.
  • Parameter settings: Range, cycle, temperature, etc.
  • Historical data: Calibration records, measurement records.

Remote Control:

  • Start calibration/measurement.
  • Emergency stop.
  • System initialization.
  • Time synchronization function.
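Every Modbus RTU frame ends with a CRC-16 checksum. The sketch below shows only the standard Modbus CRC calculation; the request fields are illustrative, and the 3106's actual register map must be taken from its manual:

```python
def crc16_modbus(frame: bytes) -> int:
    """CRC-16/MODBUS: init 0xFFFF, reflected polynomial 0xA001, no final XOR."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc

# Read-holding-registers request for slave address 1 (field values illustrative)
request = bytes([0x01, 0x03, 0x00, 0x00, 0x00, 0x01])
full_frame = request + crc16_modbus(request).to_bytes(2, "little")
# A receiver that recomputes the CRC over the full frame gets zero
assert crc16_modbus(full_frame) == 0
```

In practice a Modbus master library (serial, 9600 baud by default, slave address 1) handles this framing; the CRC is shown only to make the on-the-wire protocol concrete.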

6.3 Data Output

Analog Output:

  • Two 4 – 20 mA outputs (maximum load 900 Ω).
  • Can be set to correspond to the upper and lower limits of the range.
  • Can configure output values for error/warning/non-operation states.
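Mapping a reading onto the 4 – 20 mA loop is a linear scaling between the configured range limits. A hedged sketch (clamping out-of-range values to the loop limits is an assumption here; the instrument's behavior for error states is configurable):

```python
def cod_to_milliamps(value, low, high):
    """Scale a COD reading (mg/L) onto the 4-20 mA loop for a configured range."""
    fraction = (value - low) / (high - low)
    fraction = min(max(fraction, 0.0), 1.0)  # clamp readings outside the range
    return 4.0 + 16.0 * fraction

print(cod_to_milliamps(1000.0, 0.0, 2000.0))  # 12.0 (mid-range)
print(cod_to_milliamps(2500.0, 0.0, 2000.0))  # 20.0 (clamped at full scale)
```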

Relay Output:

  • Seven dry contacts, 2A @ 250VAC.
  • Can set alarm thresholds (high/low points).

VII. Accessories and Customer Service

7.1 Accessory Information

  • 3106COD: Main unit (without reagents)
  • 3106REC: Reagent set (Reagent 1 + 2)
  • 3106200: 200 mg/L COD standard solution
  • 31061000: 1000 mg/L COD standard solution
  • 3106MK12: 12-month maintenance kit
  • 3106MK24: 24-month maintenance kit

7.2 Customer Service

Warranty Terms:

  • 12 months after installation or 18 months after delivery (whichever comes first).
  • Consumables must be stored at 5 – 45°C and used within the shelf life.

Notes:

  • Returns must be authorized within 30 days.
  • Hazardous materials transportation requires special handling.
  • Expedited orders are subject to an additional fee.

VIII. Conclusion

The Thermo Fisher Orion 3106 COD Online Automatic Monitor, as a professional water quality analysis device, requires correct use and maintenance to obtain accurate and reliable monitoring data. Through the systematic introduction in this guide, users should be able to fully master:

Safety Regulations: Always prioritize safe operation and strictly adhere to electrical, chemical, and operational environment safety requirements.

Standardized Operation: Follow standard procedures for installation, startup, calibration, and measurement to ensure data accuracy.

Preventive Maintenance: Establish a regular maintenance plan to proactively prevent potential problems and extend equipment life.

Fault Handling Capability: Familiarize yourself with common alarm and error handling methods to improve problem-solving efficiency.

Advanced Applications: Fully utilize advanced functions such as pre-run/post-run and Modbus communication to achieve automated monitoring.

Correct use of the Orion 3106 COD Monitor not only provides accurate water quality data for environmental protection decision-making but also maximizes equipment performance and reduces operation and maintenance costs. It is recommended that users regularly participate in manufacturer-organized training and stay updated on the latest technical information to ensure the equipment is always in optimal working condition.


Jenway 6800 Dual-Beam Spectrophotometer In-Depth Operation Manual Guide

I. Brand and Instrument Overview

Brand: Jenway (now part of the Cole-Parmer Group)

Instrument Model: Model 6800 Dual-Beam UV/Visible Spectrophotometer

Application Areas: Laboratory environments such as education, quality control, environmental analysis, and clinical analysis

Core Features:

  • Dual-Beam Design: Enhances optical stability and measurement accuracy.
  • Wide Wavelength Range: 190-1100nm, covering the ultraviolet to near-infrared spectrum.
  • Multifunctional Modes: Supports photometric measurements, multi-wavelength scanning, kinetic analysis, quantitative determination, and specialized protein/nucleic acid detection.
  • Modular Accessories: Compatible with various sample holders, including microplates, long-path cuvettes, and temperature-controlled circulation cells.

II. Core Content Analysis of the Operation Manual

1. Safety and Installation Specifications

Safety Warnings:

  • Only trained personnel should operate the instrument. Avoid contact with high-voltage components.
  • The operating environment should be free of corrosive gases, with a stable temperature (10-35°C) and humidity (45-85%).
  • Do not disassemble non-user-serviceable parts, as this will void the warranty.

Installation Steps:

  • Remove the light source protective foam after unpacking.
  • Use two people to lift the 27kg main unit to avoid dropping it.
  • Power requirements: 110-240V AC, grounded, and with stable voltage.

2. Software System Configuration

Flight Deck Software Installation:

  • Compatible with Windows 2000/XP/Vista, requiring a 1GHz CPU, 256MB RAM, and 500MB of hard disk space.
  • Install via CD, with the default installation path set to C:\Program Files\FlightDeck. A desktop shortcut is created after installation.

Instrument Connection:

  • Use an RS232 serial port or USB adapter to communicate with the computer.
  • Complete a self-check (approximately 1 minute) upon first startup.

3. Basic Operation Procedures

3.1 Photometric Measurement Mode (Photometrics)

Steps:

  • Parameter Settings: Select ABS/%T/Energy mode and set the wavelength (1-6 wavelengths).
  • Blank Calibration: Insert the blank solution and click “Blank Calibration” to automatically zero.
  • Sample Measurement: Replace with the sample to be tested and click “Measure” to record the data.
  • Data Processing: Supports export to Excel and can calculate absorbance ratios or differences.

3.2 Spectrum Scan Mode (Spectrum Scan)

Key Parameters:

  • Scan Speed: 10-3600nm/min.
  • Baseline Correction: Option for system baseline or user-defined baseline.

Advanced Features:

  • Peak/Valley Detection: Adjust detection accuracy via threshold and sensitivity settings.
  • Derivative Spectrum: Generate second-derivative spectra with one click.
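Numerically, a second-derivative spectrum is a second difference of the absorbance trace. A minimal central-difference sketch (the instrument's own algorithm may differ, e.g. by smoothing first):

```python
def second_derivative(absorbances, step_nm):
    """Central-difference second derivative of an absorbance trace
    sampled at a fixed wavelength step (nm)."""
    h2 = step_nm * step_nm
    return [
        (absorbances[i - 1] - 2.0 * absorbances[i] + absorbances[i + 1]) / h2
        for i in range(1, len(absorbances) - 1)
    ]

# A quadratic trace A = x^2 has a constant second derivative of 2
trace = [float(x * x) for x in range(6)]  # sampled every 1 nm
print(second_derivative(trace, 1.0))  # [2.0, 2.0, 2.0, 2.0]
```

Second derivatives sharpen overlapping peaks and suppress sloping baselines, which is why the mode is offered alongside plain scanning.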

3.3 Quantitative Analysis (Quantitation)

  • Calibration Curve: Supports 1-100 standard samples, with options for linear, quadratic, or piecewise fitting. Example: pre-stored calibration curves can be imported for protein concentration determination.
  • Path Correction: Applicable to non-10mm pathlength cuvettes; the software converts the absorbance automatically.
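Path correction follows from the Beer-Lambert law: absorbance scales linearly with pathlength, so a reading from a non-standard cuvette is normalized to the 10 mm equivalent. A sketch of the conversion the software performs:

```python
def pathlength_corrected(absorbance, path_mm):
    """Normalize a measured absorbance to the standard 10 mm path
    (Beer-Lambert: A is proportional to pathlength)."""
    return absorbance * (10.0 / path_mm)

print(pathlength_corrected(0.05, 1.0))  # 0.5, from a 1 mm micro-cuvette
# A 50 mm long-path cuvette reading of 1.2 scales down to 0.24
```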

4. Specialized Application Modules

4.1 Nucleic Acid Analysis (DNA/RNA)

Calculation Formulas:

  • Concentration (μg/mL) = A260 × conversion factor (50 for dsDNA, 40 for RNA).
  • Purity Assessment: A260/A280 ratio (≈1.8 for pure dsDNA, ≈2.0 for pure RNA).
    Note: Enable A320 correction to eliminate turbidity interference.
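The formulas above combine into a short calculation. The Python sketch below is illustrative only (the function name is hypothetical); it subtracts A320 as the turbidity correction, as recommended in the note:

```python
def nucleic_acid_report(a260, a280, a320=0.0, sample_type="dsDNA"):
    """Concentration (μg/mL) and A260/A280 purity from UV absorbance.

    Uses the conversion factors stated in the text (50 for dsDNA,
    40 for RNA) and applies the A320 turbidity correction.
    """
    factors = {"dsDNA": 50.0, "RNA": 40.0}
    conc = (a260 - a320) * factors[sample_type]   # μg/mL
    purity = (a260 - a320) / (a280 - a320)        # ≈1.8 for pure dsDNA
    return conc, purity
```

For example, A260 = 0.50, A280 = 0.27, A320 = 0.02 gives 24 μg/mL dsDNA at a purity ratio of 1.92.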

4.2 Protein Detection

Method Selection:

  • Bradford Method: Detection at 595nm.
  • Lowry Method: Detection at 750nm.
  • Direct UV Method: Utilizes tyrosine absorption at 280nm without staining.
    Data Export: Supports generation of statistical reports with SD and CV.
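The SD and CV columns in the exported report are standard replicate statistics. A minimal sketch (Python, standard library only; not the instrument's own report generator):

```python
import statistics

def replicate_stats(values):
    # Sample standard deviation (n-1 denominator) and coefficient of
    # variation in %, as typically reported for replicate measurements.
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    cv = 100.0 * sd / mean
    return mean, sd, cv
```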

5. Accessory Operation Guide

Temperature-Controlled Water Bath Cuvette Holder:

  • Remove the original holder and install the circulation water interface.
  • Set the water temperature and connect to an external temperature-controlled water bath.
  • Introduce dry gas to prevent condensation.

Micro-Volume Cuvette (50μL):

  • Use a dedicated holder, avoid bubbles during filling, and correct the pathlength to 10mm.

III. Maintenance and Troubleshooting

1. Daily Maintenance

Cleaning:

  • Sample Chamber: Wipe the window with isopropyl alcohol.
  • Cuvettes: For stubborn stains only, soak quartz cuvettes in dilute nitric acid (never hydrofluoric acid, which etches quartz); do not reuse plastic cuvettes.

Light Source Replacement:

  • Tungsten Lamp: Allow to cool for 20 minutes before replacement and reset the usage time.
  • Deuterium Lamp: Wear gloves and avoid touching the quartz window.

2. Common Issues

  • Baseline Drift: Check temperature stability or re-execute baseline correction.
  • Inaccurate Wavelength: Calibrate using the built-in holmium glass filter.
  • Communication Failure: Check the RS232 port configuration.

IV. Technical Parameter Quick Reference Table

  • Wavelength Accuracy: ±0.3 nm
  • Photometric Accuracy: ±0.002 A (0 – 0.5 A range)
  • Stray Light: <0.05% (at 220 nm)
  • Dimensions: 540 × 560 × 235 mm

V. Original Usage Recommendations

Method Development Tips:

  • For high-concentration samples, use the “dilution factor” function to calculate the original concentration.
  • When performing multi-wavelength scans, enable “multi-file overlay” to compare samples from different batches.
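The "dilution factor" tip above is a simple back-calculation; a one-line sketch (Python, illustrative name):

```python
def original_concentration(measured, dilution_factor):
    # Undo a pre-measurement dilution: e.g. a 1:10 dilution has factor 10,
    # so a measured 2.5 mg/mL corresponds to 25 mg/mL in the original.
    return measured * dilution_factor
```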

Data Management:

  • Establish standardized naming conventions (e.g., “date_sample name_wavelength”) for easy traceability.
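A naming convention like "date_sample name_wavelength" is easy to enforce in a small helper. The sketch below (Python; the function name, `.csv` extension, and space-to-hyphen substitution are assumptions for illustration) keeps names filesystem-friendly and sortable by date:

```python
from datetime import date

def export_filename(sample_name, wavelength_nm, when=None):
    # "date_sample name_wavelength" convention; hyphens replace spaces
    # so the name stays safe across filesystems.
    when = when or date.today()
    safe = sample_name.replace(" ", "-")
    return f"{when:%Y%m%d}_{safe}_{wavelength_nm}nm.csv"
```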

Compliance:

  • Regularly perform IQ/OQ validation (templates provided in the operation manual appendix).

Technical Support:

  • For further assistance, contact the Cole-Parmer official technical service team for customized solutions.

Hach COD – 203 Online CODMn (Permanganate Index) Analyzer User Guide

I. Product Overview and Basic Principles

1.1 Product Introduction

The Hach COD – 203 online CODMn (permanganate index) analyzer is a precision instrument specifically designed for the automatic monitoring of the chemical oxygen demand (COD) concentration in industrial wastewater, river, and lake water bodies. Manufactured in accordance with the JIS K 0806 “Automatic Measuring Apparatus for Chemical Oxygen Demand (COD)” standard, this device employs fully automated measurement operations and adheres to the measurement principle of “Oxygen Consumption by Potassium Permanganate at 100°C (CODMn)” specified in the JIS K 0102 standard.

1.2 Measurement Principle

This analyzer utilizes the redox potential titration method to achieve precise determination of COD values through the following steps:

  1. Oxidation Reaction: A fixed amount of potassium permanganate solution is added to the water sample, which is then heated at 100°C for 30 minutes to oxidize organic and inorganic reducing substances in the water.
  2. Residual Titration: An excess of sodium oxalate solution is added to react with the unreacted potassium permanganate; the remaining sodium oxalate is then back-titrated with potassium permanganate.
  3. Endpoint Determination: A platinum electrode detects the inflection point of the redox potential, from which the amount of potassium permanganate consumed is calculated and converted into the COD value.
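The steps above reduce to a volume-to-COD conversion. The sketch below (Python, for illustration; variable names are not the instrument's own) follows the general JIS K 0102 form, where 0.2 mg is the oxygen equivalent of 1 mL of 5 mmol/L KMnO4 titrant:

```python
def cod_mn(v_sample_mL, v_titrant_mL, v_blank_mL, factor=1.0):
    """CODMn (mg O2/L) from permanganate back-titration volumes.

    COD = (a - b) * f * (1000 / V) * 0.2, with a = titrant volume for
    the sample, b = blank volume, f = titrant factor, V = sample volume.
    Assumes 5 mmol/L KMnO4 (0.2 mg O2 per mL).
    """
    return (v_titrant_mL - v_blank_mL) * factor * (1000.0 / v_sample_mL) * 0.2
```

For a 100 mL sample with a 5.0 mL titration and 1.0 mL blank, this gives 8.0 mg O2/L.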

1.3 Technical Features

  • Measurement Range: 0 – 20 mg/L to 0 – 2000 mg/L (multiple ranges available)
  • Measurement Cycle: 1 hour per measurement (configurable from 1 – 6 hours)
  • Flow Path Configuration: Standard configuration is 1 flow path with 1 range; optional 2 flow paths with 2 ranges
  • Measurement Methods: Supports acidic and alkaline methods (applicable to water samples with high chloride ion content)
  • Automation Level: Fully automated process including sampling, reagent addition, heating digestion, and titration calculation

II. Equipment Installation and Initial Setup

2.1 Installation Requirements

Environmental Requirements:

  • Temperature: 5 – 40°C
  • Humidity: ≤85% RH
  • Avoid direct sunlight, corrosive gases, and strong vibrations

Water Sample Requirements:

  • Temperature: 2 – 40°C
  • Pressure: 0.02 – 0.05 MPa
  • Flow rate: 0.5 – 4 L/min
  • Chloride ion limit: ≤2000 mg/L (for the 20 mg/L range)

Power and Water Supply:

  • Power supply: AC100V ± 10%, 50/60 Hz, maximum power consumption 550 VA
  • Pure water supply: Pressure 0.1 – 0.5 MPa, flow rate approximately 2 L/min

2.2 Equipment Installation Steps

Mechanical Installation:

  • Select a sturdy and level installation base.
  • Secure the equipment using four M12 × 200 anchor bolts.
  • Ensure the equipment is level and maintain a maintenance space of ≥1 m around it.

Pipe Connection:

  • Sampling pipe: Rc1/2 interface, recommended to use transparent PVC pipes (Φ13 or Φ16)
  • Pure water pipe: Rc1/2 interface, install an 80-mesh Y-type filter at the front end
  • Drain pipe: Rc1 interface, maintain a natural drainage slope of ≥1/50
  • Waste liquid pipe: Φ10 × Φ14.5 dedicated pipe, connect to a waste liquid container

Electrical Connection:

  • Power cable: 1.25 mm² × 3-core shielded cable
  • Grounding: Class D grounding (grounding resistance ≤100 Ω)
  • Signal output: Dual-channel isolated output of 4 – 20 mA/0 – 1 V
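Reading the 4 – 20 mA loop output back into a COD value is a linear scaling across the configured range. A minimal sketch (Python, illustrative; the span endpoints are assumptions matching whatever range is configured):

```python
def current_to_cod(current_mA, range_max_mgL, range_min_mgL=0.0):
    # 4 mA maps to the bottom of the range, 20 mA to the top;
    # everything in between scales linearly over the 16 mA span.
    span = range_max_mgL - range_min_mgL
    return range_min_mgL + (current_mA - 4.0) / 16.0 * span
```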

III. Reagent Preparation and System Preparation

3.1 Reagent Types and Preparation

Reagent 1 (Acidic Method):

  • Take 1000 g of special-grade silver nitrate.
  • Add pure water to reach a total volume of 5 L.
  • Store in a light-proof container and connect with a yellow hose.

Reagent 2 (Sulfuric Acid Solution):

  • Prepare 2 – 3 L of pure water in a container.
  • Slowly add 1.7 L of special-grade sulfuric acid (in 6 – 7 batches, with an interval of 10 – 20 minutes).
  • Add 5 mmol/L potassium permanganate dropwise until a faint red color is maintained for 1 minute.
  • Add pure water to reach 5 L and connect with a green hose.

Reagent 3 (Sodium Oxalate Solution):

  • Take 8.375 g of special-grade sodium oxalate (dried at 200°C for 1 hour).
  • Add pure water to reach 5 L and connect with a blue hose.

Reagent 4 (Potassium Permanganate Solution):

  • Dissolve 4.0 g of special-grade potassium permanganate in 5.5 L of pure water.
  • Boil for 1 – 2 hours, cool, and let stand overnight.
  • Filter, then standardize by titration to a factor of 0.95 – 0.98.
  • Store in a 10 L light-proof container and connect with a red hose.

3.2 System Initial Preparation

Electrode Internal Solution Preparation:

  • Dissolve 200 g of potassium sulfate in 1 L of distilled water at 50°C to prepare a saturated solution.
  • Take the supernatant and dilute it with 1 L of distilled water.
  • Inject the solution into the reference electrode container to fill one-third of its volume.

Heating Tank Oil Filling:

  • Inject approximately 500 mL of heat transfer oil through the hole in the heating tank cover.
  • The oil level should be between the two liquid level marks.

Pipe Flushing:

  • Open the sampling valve and pure water valve to expel air from the pipes.
  • Start the activated carbon filter (BV1 valve).
  • Set the flow rate to 1 L/min (PV7 valve).

IV. Detailed Operation Procedures

4.1 Power-On and Initialization

  • Turn on the power supply and confirm that the POWER indicator light is on.
  • Load the recording paper (76 mm wide thermal paper).
  • Perform Reagent 4 filling:
    • Enter the maintenance menu and select “Reagent 4 Injection/Attraction”.
    • Confirm that the liquid is purple and free of bubbles.

Preheating:

  • Check the heating tank temperature (INPUT screen).
  • The temperature must reach above 85°C before measurement can begin.

4.2 Calibration Procedures

Zero Calibration:

  • Enter the ZERO CALIB screen.
  • Set the number of calibrations (default is 3 times).
  • Start the calibration using activated carbon-filtered water.
  • Confirm that the calibration value is within the range of 0.100 – 2.500 mL.

Span Calibration:

  • Enter the SPAN CALIB screen.
  • Select the range (R1 or R2).
  • Use a 1/2 full-scale sodium oxalate standard solution.
  • Confirm that the calibration value is within the range of 4.000 – 8.000 mL.

Automatic Calibration Settings:

  • Parameter B07: Set the calibration cycle (1 – 30 days).
  • Parameter B08: Set the calibration start time.
  • Parameter B09: Set the date for the next calibration.

4.3 Routine Measurement

Main Interface Check:

  • Confirm that the “AUTO” status indicator light is on.
  • Check the remaining amounts of reagents and the status of the waste liquid container.

Start Measurement:

  • Select “SAMPLE” on the OPERATION screen.
  • The system will automatically complete the sampling, heating, and titration processes.

Data Viewing:

  • The DATA screen displays data from the last 12 hours.
  • The CURVE screen shows the titration curve shape.
  • Alarm information is centrally displayed on the ALARM screen.

V. Maintenance Procedures

5.1 Daily Maintenance

Daily Checks:

  • Reagent and waste liquid levels.
  • Recording paper status and print quality.
  • Leakage in pipe connections.

Weekly Maintenance:

  • Activated carbon filter inspection.
  • Backflushing of the sampling pipe.
  • Solenoid valve operation test.

5.2 Regular Maintenance

Monthly Maintenance:

  • Cleaning and calibration of the measuring device.
  • Cleaning of the reaction tank and electrodes.
  • Replacement of control valve hoses.

Quarterly Maintenance:

  • Replacement of heating oil.
  • Inspection and replacement of pump diaphragms.
  • Comprehensive flushing of the pipe system.

Annual Maintenance:

  • Replacement of key components (electrodes, measuring devices, etc.).
  • Comprehensive calibration of system parameters.
  • Lubrication and maintenance of mechanical components.

5.3 Reagent Replacement Cycles

  • Reagent 1 (Silver Nitrate): Approximately 14 days/5 L
  • Reagent 2 (Sulfuric Acid): Approximately 14 days/5 L
  • Reagent 3 (Sodium Oxalate): Approximately 14 days/5 L
  • Reagent 4 (Potassium Permanganate): Approximately 14 days/10 L

VI. Fault Diagnosis and Handling

6.1 Common Alarm Handling

AL – L (Minor Fault):

  • Symptom: Automatic measurement continues.
  • Handling: Check the alarm content and press the ALLINIT key twice to reset.

AL – H (Major Fault):

  • Symptom: Measurement is suspended.
  • Typical Causes:
    • Abnormal heating temperature: Check the heater, SSR, and TC1 sensor.
    • Full waste liquid tank: Empty the waste liquid and check the FS2 switch.
    • Abnormal titration pump: Check the TP pump and SV16 valve.

6.2 Analysis of Abnormal Measurement Values

Data Drift:

  • Check the validity period and preparation accuracy of reagents.
  • Verify the response performance of electrodes.
  • Re-perform two-point calibration.

No Data Output:

  • Check the sampling system (pump, valve, filter).
  • Verify that parameter G01 = 1 (printer enabled).
  • Test the signal output line.

Large Data Deviation:

  • Perform manual comparison tests.
  • Adjust conversion parameters (D01 – D04).
  • Check the representativeness of sampling and pretreatment.

VII. Safety Precautions

7.1 Safety Sign Explanations

  • Warning: Indicates a serious hazard that may cause severe injury or death.
  • Caution: Indicates a general hazard that may cause minor injury or equipment damage.
  • Important: Key matters for maintaining equipment performance.

7.2 Safety Operation Procedures

Personal Protection:

  • Wear protective gloves and glasses when handling reagents.
  • Use a gas mask when handling waste liquid.

Chemical Safety:

  • Dilute sulfuric acid by adding “acid to water”.
  • Avoid contact between potassium permanganate and organic substances.
  • Store silver nitrate solution in a light-proof container.

Electrical Safety:

  • Do not touch internal terminals when the power is on.
  • Ensure reliable grounding.
  • Cut off the power supply before maintenance.

High-Temperature Protection:

  • The reaction tank reaches 100°C; allow it to cool before maintenance.
  • Heating oil may cause burns.

VIII. Technical Parameters and Appendices

8.1 Main Technical Parameters

  • Measurement Principle: Redox potential titration method
  • Measurement Range: 0 – 20 mg/L to 0 – 2000 mg/L (optional)
  • Repeatability: ≤±1% FS (for the 20 mg/L range)
  • Stability: ≤±3% FS/24 h
  • Output Signal: 4 – 20 mA/0 – 1 V
  • Communication Interface: Optional RS485/Modbus

8.2 Consumables List

Standard Consumables:

  • Printer ribbon (131F083)
  • Recording paper (131H404)
  • Silicone oil (XC885030)

Annual Consumables:

  • Pump diaphragm (125A114)
  • Control valve (126B831)
  • Activated carbon (136A075)

This guide comprehensively covers the operational key points of the Hach COD – 203 analyzer. In actual use, adjustments should be made based on specific water quality characteristics and site conditions. It is recommended to establish a complete equipment file to record each maintenance, calibration, and fault handling situation to ensure the long-term stable operation of the equipment.


Technical Study on Troubleshooting and Repair of Mastersizer 3000: Air Pressure Zero and Insufficient Vacuum Issues

1. Introduction

The Mastersizer 3000 is a widely used laser diffraction particle size analyzer manufactured by Malvern Panalytical. It has become a key analytical tool in industries such as pharmaceuticals, chemicals, cement, food, coatings, and materials research. By applying laser diffraction principles, the instrument provides rapid, repeatable, and accurate measurements of particle size distributions.

Among its various configurations, the Aero S dry powder dispersion unit is essential for analyzing dry powders. This module relies on compressed air and vacuum control to disperse particles and to ensure that samples are introduced without agglomeration. Therefore, the stability of the pneumatic and vacuum subsystems directly affects data quality.

In practice, faults sometimes occur during startup or system cleaning. One such case involved a user who reported repeated errors during initialization and cleaning. The system displayed the following messages:

  • “Pression d’air = 0 bar” (Air pressure = 0 bar)
  • “Capteur de niveau de vide insuffisant” (Vacuum level insufficient)
  • “A problem has occurred during system clean. Press reset to retry”

While the optical laser subsystem appeared normal (laser intensity ~72.97%), the pneumatic and vacuum functions failed, preventing measurements.
This article will analyze the fault systematically, covering:

  • The operating principles of the Mastersizer 3000 pneumatic and vacuum systems
  • Fault symptoms and possible causes
  • A detailed troubleshooting and repair workflow
  • Case study insights
  • Preventive maintenance measures

The goal is to form a comprehensive technical study that can be used as a reference for engineers and laboratory technicians.


2. Working Principle of the Mastersizer 3000 and Pneumatic System

2.1 Overall Instrument Architecture

The Mastersizer 3000 consists of the following core modules:

  1. Optical system – Laser light source, lenses, and detectors that measure particle scattering signals.
  2. Dispersion unit – Either a wet dispersion unit (for suspensions) or the Aero S dry powder dispersion system (for powders).
  3. Pneumatic subsystem – Supplies compressed air to the Venturi nozzle to disperse particles.
  4. Vacuum and cleaning system – Provides suction during cleaning cycles to remove residual particles.
  5. Software and sensor monitoring – Continuously monitors laser intensity, detector signals, air pressure, vibration rate, and vacuum level.

2.2 The Aero S Dry Dispersion Unit

The Aero S operates based on Venturi dispersion:

  • Compressed air (typically 4–6 bar, oil-free and dry) passes through a narrow nozzle, creating high-velocity airflow.
  • Powder samples introduced into the airflow are broken apart into individual particles, which are carried into the laser measurement zone.
  • A vibrator ensures continuous and controlled feeding of powder.

To monitor performance, the unit uses:

  • Air pressure sensor – Ensures that the compressed air pressure is within the required range.
  • Vacuum pump and vacuum sensor – Used during System Clean cycles to generate negative pressure and remove any residual powder.
  • Electro-pneumatic valves – Control the switching between measurement, cleaning, and standby states.

2.3 Alarm Mechanisms

The software is designed to protect the system:

  • If the air pressure < 0.5 bar or the pressure sensor detects zero, it triggers “Pression d’air = 0 bar”.
  • If the vacuum pump fails or the vacuum sensor detects insufficient negative pressure, it triggers “Capteur de niveau de vide insuffisant”.
  • During cleaning cycles, if either air or vacuum fails, the software displays “A problem has occurred during system clean”, halting the process.
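The interlock logic described above can be summarized as a few threshold checks. The sketch below (Python, illustrative only; the thresholds and message strings come from the text, but this is not Malvern's actual firmware logic):

```python
def pneumatic_alarms(pressure_bar, vacuum_ok):
    """Return the alarm messages the text describes for a given state."""
    alarms = []
    if pressure_bar < 0.5:
        alarms.append("Pression d'air = 0 bar")       # air pressure alarm
    if not vacuum_ok:
        alarms.append("Capteur de niveau de vide insuffisant")
    if alarms:
        # Either failure halts the cleaning cycle.
        alarms.append("A problem has occurred during system clean")
    return alarms
```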

3. Fault Symptoms

3.1 Observed Behavior

The reported system displayed the following symptoms:

  1. Air pressure reading = 0 bar (even though external compressed air was connected).
  2. Vacuum insufficient – Cleaning could not be completed.
  3. Each attempt at System Clean resulted in the same error.
  4. Laser subsystem operated normally (~72.97% signal), confirming that the fault was confined to pneumatic/vacuum components.

3.2 Screen Snapshots

  • Laser: ~72.97% – Normal.
  • Air pressure: 0 bar – Abnormal.
  • Vacuum insufficient – Abnormal.
  • System Clean failed – Symptom repeated after each attempt.

4. Possible Causes

Based on the working principle, the issue can be classified into four categories:

4.1 External Compressed Air Problems

  • Insufficient pressure supplied (below 3 bar).
  • Moisture or oil contamination in the air supply leading to blockage.
  • Loose or disconnected inlet tubing.

4.2 Internal Pneumatic Issues

  • Venturi nozzle blockage – Powder residue, dust, or oil accumulation.
  • Tubing leak – Cracked or detached pneumatic hoses.
  • Faulty solenoid valve – Valve stuck closed, preventing airflow.

4.3 Vacuum System Issues

  • Vacuum pump not starting (electrical failure).
  • Vacuum pump clogged filter, reducing suction.
  • Vacuum hose leakage.
  • Defective vacuum sensor giving false signals.

4.4 Sensor or Control Electronics

  • Air pressure sensor drift or failure.
  • Vacuum sensor malfunction.
  • Control board failure in reading sensor values.
  • Loose electrical connections.

5. Troubleshooting Workflow

A structured troubleshooting approach helps isolate the problem quickly.

5.1 External Checks

  1. Verify that compressed air supply ≥ 4 bar.
  2. Inspect inlet tubing and fittings for leaks or loose connections.
  3. Confirm that a dryer/filter is installed to ensure oil-free and moisture-free air.

5.2 Pneumatic Circuit Tests

  1. Run manual Jet d’air in software. Observe if air flow is audible.
  2. If no airflow, dismantle and inspect the Venturi nozzle for blockage.
  3. Check solenoid valve operation: listen for clicking sound when activated.

5.3 Vacuum System Tests

  1. Run manual Clean cycle. Listen for the vacuum pump running.
  2. Disconnect vacuum tubing and feel for suction.
  3. Inspect vacuum filter; clean or replace if clogged.
  4. Measure vacuum with an external gauge.

5.4 Sensor Diagnostics

  1. Open Diagnostics menu in the software.
  2. Compare displayed sensor readings with actual measured pressure/vacuum.
  3. If real pressure exists but software shows zero → sensor fault.
  4. If vacuum pump works but error persists → vacuum sensor fault.
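The decision rules in steps 3 – 4, combined with the external checks in 5.1, amount to a small classifier. A sketch (Python; names and thresholds are illustrative, with 4 bar taken as the minimum acceptable supply per section 5.1):

```python
def classify_fault(gauge_pressure_bar, software_pressure_bar,
                   pump_runs, vacuum_error):
    """Separate sensor faults from real supply problems by comparing an
    external gauge reading against the software's reported value."""
    if gauge_pressure_bar >= 4.0 and software_pressure_bar == 0.0:
        return "pressure sensor fault"        # real pressure, software reads zero
    if gauge_pressure_bar < 4.0:
        return "external air supply problem"  # supply below specification
    if pump_runs and vacuum_error:
        return "vacuum sensor fault"          # pump works but error persists
    return "further electronics checks needed"
```

In the case study of section 6.5, the gauge read only 1.4 bar, which this logic correctly flags as an external supply problem.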

5.5 Control Electronics

  1. Verify power supply to pneumatic control board.
  2. Check connectors between sensors and board.
  3. If replacing sensors does not fix the issue, the control board may require replacement.

6. Repair Methods and Case Analysis

6.1 Air Supply Repairs

  • Adjust and stabilize supply at 5 bar.
  • Install or replace dryer filters to prevent moisture/oil contamination.
  • Replace damaged air tubing.

6.2 Internal Pneumatic Repairs

  • Clean Venturi nozzle with alcohol or compressed air.
  • Replace faulty solenoid valves.
  • Renew old or cracked pneumatic tubing.

6.3 Vacuum System Repairs

  • Disassemble vacuum pump and clean filter.
  • Replace vacuum pump if motor does not run.
  • Replace worn sealing gaskets.

6.4 Sensor Replacement

  • Replace faulty pressure sensor or vacuum sensor.
  • Recalibrate sensors after installation.

6.5 Case Study Result

In the real case:

  • External compressed air supply was only 1.4 bar, below specifications.
  • The vacuum pump failed to start (no noise, no suction).
  • After increasing compressed air supply to 5 bar and replacing the vacuum pump, the system returned to normal operation.

7. Preventive Maintenance Recommendations

7.1 Air Supply Management

  • Maintain external compressed air ≥ 4 bar.
  • Always use an oil-free compressor.
  • Install a dryer and oil separator filter, replacing filter elements regularly.

7.2 Routine Cleaning

  • Run System Clean after each measurement to avoid powder buildup.
  • Periodically dismantle and clean the Venturi nozzle.

7.3 Vacuum Pump Maintenance

  • Inspect and replace filters every 6–12 months.
  • Monitor pump noise and vibration; service if abnormal.
  • Replace worn gaskets and seals promptly.

7.4 Sensor Calibration

  • Perform annual calibration of air pressure and vacuum sensors by the manufacturer or accredited service center.

7.5 Software Monitoring

  • Regularly check the Diagnostics panel to detect early drift in sensor readings.
  • Record data logs to compare performance over time.

8. Conclusion

The Mastersizer 3000, when combined with the Aero S dry dispersion unit, relies heavily on stable air pressure and vacuum control. Failures such as “Air pressure = 0 bar” and “Vacuum level insufficient” disrupt operation, especially during System Clean cycles.

Through systematic analysis, the faults can be traced to:

  • External compressed air issues (low pressure, leaks, contamination)
  • Internal pneumatic blockages or valve faults
  • Vacuum pump failures or leaks
  • Sensor malfunctions or control board errors

A structured troubleshooting process — starting from external supply → pneumatic circuit → vacuum pump → sensors → electronics — ensures efficient fault localization.
In the reported case, increasing the compressed air pressure and replacing the defective vacuum pump successfully restored the instrument.

For laboratories and production environments, preventive maintenance is crucial:

  • Ensure stable, clean compressed air supply.
  • Clean and service nozzles, filters, and pumps regularly.
  • Calibrate sensors annually.
  • Monitor diagnostics to detect anomalies early.

By applying these strategies, downtime can be minimized, measurement accuracy preserved, and instrument lifespan extended.