An Introduction to Camera Terminology

1. Focal length: Focal length is a fundamental concept in optics and photography that describes the distance from the lens's optical center to the image sensor when the subject is in focus (strictly, when focused at infinity). It is measured in millimeters (mm) and plays a crucial role in determining the field of view, magnification, and perspective of an image;
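The relationship between focal length and field of view can be sketched with the standard thin-lens approximation. The function name and the example sensor width below are illustrative, not from the source:

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal angle of view in degrees, thin-lens model focused at infinity."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A full-frame sensor (36 mm wide) with a 50 mm lens gives roughly a 39.6 degree
# horizontal field of view; shorter focal lengths widen it, longer ones narrow it.
print(round(horizontal_fov_deg(50, 36), 1))
```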
2. Depth of field: Depth of field (DoF) refers to the range of distance within a photo that appears acceptably sharp and in focus. In other words, the front and back depth of the subject and the blur of the image on the film surface are all within the limited range of the allowable circle of confusion. It is a crucial concept in photography and videography, as it affects the overall aesthetic and compositional elements of an image;
3. Focus: Focusing refers to the process of adjusting the position of the lens elements (and thus the lens-to-sensor distance) so that the subject is imaged sharply on the sensor or film;
4. In focus: It refers to the state of the subject being imaged clearly. In other words, the subject or specific area being photographed is clear and sharp;
5. Out of focus: It refers to the state in which the subject is not clearly imaged. The subject or specific area of the subject appears blurry;
6. Focus lock: Focus lock means locking the current focus point. Even if the camera or the subject moves, the lens focus point remains unchanged;
7. Aperture: Aperture is a crucial element in photography and optics that refers to the opening in a camera lens through which light passes to reach the camera's sensor or film. The size of the aperture affects the amount of light that enters the camera, influencing exposure, depth of field, and overall image quality. The size of the aperture is expressed in f-stop numbers, such as f/1.4, f/2.8, f/5.6, f/11, etc;
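The f-stop sequence above (f/1.4, f/2.8, f/5.6, f/11) is not arbitrary: light gathered is proportional to 1/N², so each full stop multiplies the f-number by √2. A small sketch (the function name is mine, not from the source):

```python
import math

def stops_between(n1: float, n2: float) -> float:
    """Exposure change in stops when moving from f/n1 to f/n2.

    Light is proportional to 1/N^2, so the difference in stops is
    log2(n2^2 / n1^2) = 2 * log2(n2 / n1). Positive means less light.
    """
    return 2 * math.log2(n2 / n1)

# Stopping down from f/1.4 to f/2.8 cuts the light by exactly 2 stops (4x less).
print(stops_between(1.4, 2.8))
```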
8. Optical zoom: Optical zoom refers to the ability of a camera lens to magnify a subject by physically adjusting the lens elements to change the focal length. Unlike digital zoom, which simply crops and enlarges the image digitally, optical zoom maintains the full resolution and image quality because it uses the lens's optics to bring the subject closer;
9. Digital zoom: Digital zoom is a method used by cameras to increase the apparent size of an image, giving the impression of zooming in on a subject without physically changing the lens's focal length. Unlike optical zoom, which relies on the physical movement of lens elements to magnify the image, digital zoom uses software to achieve a similar effect;
10. Mixing zoom: Mixing zoom, often referred to as hybrid zoom, is a technology that combines both optical and digital zoom to achieve greater magnification while attempting to maintain higher image quality than digital zoom alone. This approach leverages the strengths of both methods to provide better results, especially in devices like smartphones where space constraints limit the extent of optical zoom;
11. Minimum distance zoom: It refers to the minimum focusing distance of a zoom lens, or to the practical limitations of using zoom at close distances;
12. Motor temperature excursion: The phenomenon in which the travel range of the focus motor shifts as the temperature changes;
13. Long exposure: Long exposure is a photography technique where the camera's shutter remains open for an extended period, allowing more light to reach the camera's sensor or film. This technique is used to capture motion over time, resulting in a variety of creative effects that are not possible with shorter exposures;
14. Blur circle: A blur circle, also known as a circle of confusion, is a term used in photography to describe the appearance of out-of-focus points in an image. When a lens is not perfectly focused on a subject, the points of light that are out of focus are rendered as circles rather than points. These circles are what make up the blurred areas of an image;
15. ISO: ISO is a key component in digital photography and film photography that controls the sensitivity of the camera's sensor or film to light;
16. AE: AE, or Auto Exposure, is a feature in cameras that automatically adjusts the exposure settings to achieve a well-exposed image based on the lighting conditions;
17. Field of View: Field of View (FOV) is a term used in photography, videography, and optics to describe the extent of the observable world that can be seen through a camera or lens at any given moment. It essentially defines how much of a scene is captured in the frame;
18. Shutter: The shutter is a crucial component of a camera that controls the duration for which light is allowed to reach the camera's sensor or film. It plays a key role in determining the exposure of an image and affects how movement is captured in a photograph;
19. EV: EV, or Exposure Value, is a numerical system used to quantify the exposure settings of a photograph. It simplifies the relationship between aperture, shutter speed, and ISO by providing a single value that represents the overall exposure level;
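The EV relationship can be made concrete with its standard definition at ISO 100, EV = log2(N²/t), where N is the f-number and t the shutter time in seconds. The function name and example settings below are illustrative:

```python
import math

def exposure_value(f_number: float, shutter_s: float) -> float:
    """Exposure Value at ISO 100: EV = log2(N^2 / t)."""
    return math.log2(f_number ** 2 / shutter_s)

# f/2.8 at 1/60 s is about EV 8.9; f/1 at 1 s is EV 0 by definition.
print(round(exposure_value(2.8, 1 / 60), 1))
```

Any aperture/shutter pair with the same EV yields the same exposure, which is why the system is useful for reciprocity calculations.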
20. AWB: AWB stands for Auto White Balance. It is a feature in digital cameras and smartphones designed to automatically adjust the color balance of an image to ensure that white objects appear white and colors are accurate under varying lighting conditions;
21. White balance: White balance literally means the balance of white. It describes how accurately white is reproduced when the three primary colors (red, green, and blue) are mixed in the display. Its goal is to make the camera image accurately reflect the true colors of the subject;
22. Color temperature: Color temperature is a measure of the hue or warmth of light, expressed in degrees Kelvin (K). It describes the color characteristics of light sources and affects the color balance of photographs;
23. Saturation: The degree of vividness or purity of a color;
24. Grayscale: A representation of an image using only light-intensity values, without color information;
25. Lightness: The visual perception of how bright a light source or an object's surface appears;
26. Frequency: In an image, low frequencies correspond to smooth areas, while high frequencies correspond to edges, fine detail, and noise;
27. Dynamic range: Dynamic range is a term used to describe the range of luminance or color values that a camera sensor, display, or other imaging system can capture or reproduce. It measures the difference between the darkest and lightest parts of an image or scene. Quantitatively, it is the ratio of the brightest light signal the system can distinguish to the darkest, i.e., the ratio of the maximum signal to the noise floor;
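That ratio is conventionally reported in decibels or in stops. A small sketch (the function name and the example values for full-well capacity and read noise are assumptions for illustration):

```python
import math

def dynamic_range(max_signal: float, noise_floor: float):
    """Dynamic range of a sensor as (decibels, stops).

    dB = 20 * log10(max/noise); stops = log2(max/noise).
    """
    ratio = max_signal / noise_floor
    return 20 * math.log10(ratio), math.log2(ratio)

# Hypothetical sensor: 4000 e- full well, 2 e- read noise
# -> roughly 66 dB, or about 11 stops of dynamic range.
db, stops = dynamic_range(4000, 2)
```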
28. Pixel Crosstalk: Light intended for one pixel leaks into neighboring pixels, contaminating their signals. High-end sensors use auxiliary micro-lenses to redirect the incoming light and reduce this effect;
29. Ringing effect: The ringing effect is related to the Gibbs phenomenon. It is caused by choosing an inappropriate image model during image restoration; its direct cause is the loss of information, especially high-frequency information, during image degradation;
30. Artifacts: Features that appear in an image but do not exist in the photographed scene. After image processing, especially in composite images, they show up as unnatural traces, regions, or flaws that look artificially introduced;
31. Jello effect: When shooting with a rolling shutter whose line-by-line readout is not fast enough, motion causes the image to appear skewed, wobbly, or partially distorted;
32. Flicker: When a sensor captures images under fluorescent or other mains-powered lighting, flicker (banding) can occur. The root cause is that different pixels receive different amounts of light energy, and that difference appears as a difference in image brightness. A CMOS sensor is exposed row by row: all pixels on the same row share the same exposure start point and exposure duration, so every pixel in a row receives the same energy. Different rows share the same exposure duration but start at different times, so as the light intensity pulses, different rows may receive different amounts of energy. To make every row receive the same energy despite the staggered start times, the exposure time must be an integer multiple of the light-energy cycle; this condition eliminates the flicker;
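Since light energy pulses at twice the mains frequency (100 Hz under 50 Hz mains, 120 Hz under 60 Hz mains), the flicker-free condition above can be enumerated directly. The function name and the 1/15 s upper bound are illustrative assumptions:

```python
def flicker_free_shutters(mains_hz: float, max_exposure_s: float = 1 / 15):
    """Exposure times that are integer multiples of the light-energy period.

    Rectified mains light pulses at 2 * mains_hz, so the period is 1/(2*mains_hz).
    Any exposure equal to n * period collects the same energy on every row.
    """
    period = 1 / (2 * mains_hz)
    times = []
    n = 1
    while n * period <= max_exposure_s:
        times.append(n * period)
        n += 1
    return times

# Under 50 Hz mains: 1/100 s, 1/50 s, 3/100 s, ... are banding-free choices.
print(flicker_free_shutters(50))
```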
33. Pseudo-color: Wrong colors or color fringes that appear in the interpolated (demosaiced) image but were not present in the original scene;
34. Moiré: When the sampling frequency does not satisfy the sampling theorem, moiré will occur. In layman's terms, moiré is very likely to appear when the spatial frequency of the photosensitive element is close to the spatial frequency of the stripes in the image. Moiré is often accompanied by false color problems;
35. Contrast focus: Contrast-detection autofocus assumes the sharpest image is also the one with the highest contrast. The camera drives the lens along the axis pointing at the subject and captures an image at each focus position, similar to a point-by-point scan. Each image is digitized into an integer matrix and passed to the image processor, which computes its contrast. The position with the highest contrast value is selected by comparison, the lens is driven to that position, and focusing is complete;
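The scan-and-compare loop just described can be sketched as follows. Everything here is a toy model under stated assumptions: `contrast_metric` is one common sharpness score (variance of pixel differences), and the simulated `capture` function, which blurs more the farther the lens is from a hypothetical best position, stands in for a real camera:

```python
import numpy as np

def contrast_metric(img: np.ndarray) -> float:
    """Sharpness score: variance of horizontal plus vertical pixel differences."""
    return float(np.var(np.diff(img, axis=0)) + np.var(np.diff(img, axis=1)))

def contrast_autofocus(capture, positions):
    """Capture a frame at each focus position and return the sharpest position."""
    scores = [contrast_metric(capture(p)) for p in positions]
    return positions[int(np.argmax(scores))]

# Toy demo: simulate a camera whose defocus blur grows away from position 12.
rng = np.random.default_rng(0)
scene = rng.random((64, 64))

def capture(p, best=12):
    k = abs(p - best) + 1            # hypothetical: more defocus -> wider box blur
    kern = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kern, mode="same"), 1, scene)
    return np.apply_along_axis(lambda c: np.convolve(c, kern, mode="same"), 0, out)

print(contrast_autofocus(capture, range(25)))  # finds position 12
```

Real implementations refine this with a coarse-to-fine search instead of scanning every position.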
36. PDAF: Phase Detection Auto Focus, which reserves some masked pixels on the photosensitive element for phase detection. The distance between pixels and their changes are used to determine the focus offset value to achieve accurate focus;
37. Visible light: Light with wavelengths between roughly 380 nm and 780 nm;
38. Blur: Blur refers to the quality of the out-of-focus areas in the background and foreground of a photo. Many people treat it as a synonym for background blur, but it is not; blur describes how pleasing the blurred areas look, not merely how blurred they are;
39. OIS: OIS is about compensating for tiny movements of the camera during exposure. In layman's terms, it uses a floating lens, gyroscopes, and small motors. These components are controlled by a microcontroller, which moves the lens very slightly to counteract camera shake - if the camera moves to the right, the lens moves to the left. This is the best option because all the stabilization is done mechanically, not in software, which means no quality is lost in the process;
40. EIS: Electronic image stabilization (EIS) is implemented in software. Essentially, what EIS does is divide the video into chunks and compare them to the previous frame. It then determines if the motion in the frame is natural or unwanted shaky motion and corrects for it. EIS typically reduces quality because it needs to leave space from the edges of the content to make the correction. However, it has improved over the past few years. Smartphone EIS typically takes advantage of gyroscopes and accelerometers to make it more precise with less quality loss;
41. HDR Photography: HDR (High Dynamic Range) photography achieves a balanced exposure across the entire frame. This is achieved by taking multiple photos at different shutter speeds, so that each photo is exposed for a different light level. The set of images is then merged into a single photo that retains information from both the light and dark parts of the scene;
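The merge step can be sketched with a deliberately naive weighted average: each frame is divided by its exposure time to recover relative scene radiance (assuming a linear sensor), and well-exposed mid-tone pixels are trusted more than clipped ones. All names and the weighting scheme are simplifications, not a production HDR pipeline:

```python
import numpy as np

def merge_hdr(frames, exposures):
    """Naive HDR radiance merge for linear frames with values in [0, 1].

    Each frame is normalized by its exposure time; pixels near 0 or 1
    (underexposed or clipped) get low weight, mid-tones get high weight.
    """
    num = np.zeros_like(np.asarray(frames[0], dtype=float))
    den = np.zeros_like(num)
    for img, t in zip(frames, exposures):
        img = np.asarray(img, dtype=float)
        w = 1.0 - np.abs(img - 0.5) * 2.0   # hat weight: 1 at 0.5, 0 at 0 and 1
        num += w * (img / t)
        den += w
    return num / np.maximum(den, 1e-6)

# Two frames of the same scene at 1/10 s and 1/5 s agree on the radiance.
radiance = merge_hdr([np.full((2, 2), 0.2), np.full((2, 2), 0.4)], [0.1, 0.2])
```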
42. Bokeh: Bokeh (pronounced BOH-kay) is a photographic effect in which the subject of the photo remains in focus while the background is rendered out of focus; on smartphones the effect is typically simulated in software. By using portrait mode to create a bokeh effect, you can take more professional-looking photos;
43. Super Resolution: Super resolution is the practice of taking and processing multiple low-resolution photos to create a higher-resolution image. Because each low-resolution frame samples the scene slightly differently, the subtle sub-pixel differences between frames give algorithms or machine learning techniques the information needed to fill in the gaps and reconstruct more detail;
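The sub-pixel idea can be shown in an idealized form: if four low-resolution frames each sample the same scene at a different half-pixel offset, interleaving them recovers the full-resolution grid exactly. Real pipelines must first estimate those offsets from natural hand shake; this sketch simply assumes they are known:

```python
import numpy as np

OFFSETS = [(0, 0), (0, 1), (1, 0), (1, 1)]  # assumed half-pixel shifts

def super_resolve_2x(frames):
    """Interleave four offset low-res frames onto a 2x grid (idealized model)."""
    h, w = frames[0].shape
    hi = np.zeros((2 * h, 2 * w))
    for img, (dy, dx) in zip(frames, OFFSETS):
        hi[dy::2, dx::2] = img
    return hi

# Demo: subsample a known high-res scene at the four offsets, then rebuild it.
rng = np.random.default_rng(1)
scene = rng.random((8, 8))
frames = [scene[dy::2, dx::2] for dy, dx in OFFSETS]
print(np.allclose(super_resolve_2x(frames), scene))  # True: exact reconstruction
```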
44. Lens Flare: When light from a strong light source (such as the sun or artificial light) reaches the camera lens directly, it reflects and bounces off different lens elements, the aperture, and even the sensor, potentially degrading image quality and creating unwanted objects in the image. This effect is called "lens flare," and it can affect images in a variety of ways: by introducing a haze of different colors it can drastically reduce image contrast, it can add circular or semi-circular halos or "ghost images," and it can even add strange translucent shapes of various colors.