More scanner-and-printer shenanigans


  • #46
    Originally posted by Spivonious View Post
    It's akin to the 1080i vs 1080p debate. One of them has twice as many pixels, but if you're sitting a certain distance away from the screen, the human eye physically cannot pick up on the difference.
    Not to derail but WTF are you talking about?
    The difference between 1080i and 1080p is how each line in the frame of an image is drawn. Both are 1920x1080. Same pixel count.

    The new kick now is refresh rates. 60/70Hz vs. 120Hz.
    Again... the human eye will barely notice the difference.

    In 1080i, each frame of video is sent or displayed as two alternating fields. Each field is composed of 540 rows of pixels, or lines, running from the top to the bottom of the screen, with the odd field displayed first and the even field displayed second. Together, both fields create a full frame, made up of all 1,080 pixel rows or lines, every 30th of a second.

    In 1080p, each frame of video is sent or displayed progressively. This means that both the odd and even fields (all 1,080 pixel rows or pixel lines) that make up the full frame are displayed together. This results in a smoother-looking image, with fewer motion artifacts and jagged edges.
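
    If it helps to picture the mechanics, here's a minimal Python sketch (numpy arrays of made-up data standing in for real video) of how a 1080-line frame splits into two 540-line fields and weaves back together:

    import numpy as np

    # A full 1080-line frame: 1080 rows x 1920 columns (grayscale for simplicity)
    frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)

    # 1080i transmits that frame as two 540-line fields
    odd_field = frame[0::2]    # rows 0, 2, 4, ... (sent/displayed first)
    even_field = frame[1::2]   # rows 1, 3, 5, ... (sent/displayed second)

    # Interleaving ("weaving") the two fields rebuilds the full frame
    woven = np.empty_like(frame)
    woven[0::2] = odd_field
    woven[1::2] = even_field
    assert np.array_equal(frame, woven)   # lossless as long as nothing moved between fields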
    Back to your regularly scheduled PC bashing thread.
    -Rick



    • #47
      In Newc's case he's blowing some of them up a ton, from 3" high to 36", so I'd say scan as high-res as possible. He's relying on the software to hold things together as he blows it up, and most sources say to blow up in 10% increments with as many passes as it takes in order to get the best interpolation.
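
      For what it's worth, a rough Python/Pillow sketch of that stepwise approach (the filenames and the 12x factor are made up; Pillow's bicubic resize stands in for whatever the photo software actually uses):

      from PIL import Image

      def stepwise_upscale(img, target_width, step=1.10):
          """Enlarge in ~10% passes until one final resize reaches the target."""
          while img.width * step < target_width:
              img = img.resize((round(img.width * step), round(img.height * step)),
                               resample=Image.BICUBIC)
          scale = target_width / img.width
          return img.resize((target_width, round(img.height * scale)),
                            resample=Image.BICUBIC)

      scan = Image.open("logo_scan.tif")              # e.g. the 3"-high logo scan
      big = stepwise_upscale(scan, scan.width * 12)   # 3" high -> 36" high
      big.save("logo_36in.tif")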


      Originally posted by Spivonious View Post
      But DPI is always dots per inch regardless of whether the dot is a drop of printer ink or the dot is a pixel on the screen or the dot is a point in a bitmap. If he wants to print every pixel that he has scanned then he should match the two settings.
      Don't take my word for it, it's in the links I posted, plus a bunch more from the search results:

      "300 pixels per inch is generally regarded as the standard when preparing images for printing, yet current desktop inkjet printers are capable of printing at 1440, 2880 or even more dots per inch. While these sound similar, they aren't. In quoting printer resolutions that high, Epson, HP and other printer manufacturers are counting each and every one of those tiny ink droplets. Obviously doing so makes their printers sound more impressive, but don't be fooled into upsizing your image resolution to those astronomical values. If you have enough memory and disk space on your computer you certainly can print at 1440 pixels per inch, but your prints won't look any better for it. They'll just take a lot longer to print since you will be sending a heck of a lot more info to your printer. Drops this small mainly let them create color better and only have a small impact on actual resolution."

      "The number of pixels per inch in the image is NOT related to the droplets per inch
      produced by the printer.

      You never want to send a digitized image with a resolution of 2880 pixels per inch to a 2880 dots per inch printer. If you tried to make a 2880 pixels per inch image that is sized to 13 x 19 inches you would wind up with a 6 GB (Gigabyte or 1000 MB) image. In addition to that once the print driver got ahold of that image it would try to create a file of dots to send to the printhead and that file will be much greater than 6 GB. In fact for some reason known only to printers the resultant print would look worse than if you sent the printer a file with a resolution of 300 pixels per inch."
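
      That 6 GB figure checks out if you assume 24-bit color; a quick sanity check:

      # 13" x 19" at 2880 pixels per inch, 3 bytes per pixel (24-bit RGB):
      pixels = (13 * 2880) * (19 * 2880)   # 37440 x 54720 = ~2.05 billion pixels
      print(pixels * 3 / 1e9)              # ~6.15 GB, before the driver expands it further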

      Last edited by dg; 03-04-2009, 02:36 PM.



      • #48
        Originally posted by rjohnstone View Post
        Not to derail but WTF are you talking about?
        The difference between 1080i and 1080p is how each line in the frame of an image is drawn. Both are 1920x1080. Same pixel count.
        You answered it yourself. Both are 1920x1080, but 1080i is really a 1920x540 picture with black lines inserted every other row. 1080p is a solid image.

        Both look exactly the same unless you're sitting close enough to see those black lines.

        As far as refresh rates, I can see flicker up to 85Hz, so moving from 60/70Hz to 120Hz would definitely help the image look brighter for me. Too bad the source material is only 24/30fps.
        Last edited by Spivonious; 03-04-2009, 02:43 PM.
        Scott



        • #49
          Originally posted by dg View Post
          Don't take my word for it, it's in the links I posted, plus a bunch more from the search results:
          I read the links you posted. It just doesn't make sense, unless the printer manufacturers are counting each color dot separately instead of in groups of 3 like pixels are treated. Then you'd want to divide the screen dpi by 3 to get the matching printer dpi.

          So a scanned 1200dpi image would print best at 400dpi.
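
          (Spelling out the arithmetic behind that guess; the ink count is his assumption, revised in #52 below:)

          scan_ppi = 1200
          inks = 3                                # C, M, Y -- one printer dot per color channel
          matching_print_ppi = scan_ppi / inks    # = 400, hence "print best at 400dpi"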
          Scott



          • #50
            I'll try one more link before I give up. This one explains it better than either of the others. Wish I'd found it earlier:

            http://www.scantips.com/basics3b.html



            • #51
              Originally posted by Spivonious View Post
              You answered it yourself. Both are 1920x1080, but 1080i is really a 1920x540 picture with black lines inserted every other row. 1080p is a solid image.

              Both look exactly the same unless you're sitting close enough to see those black lines.
              Where did I say any black lines were "inserted"?

              A single 1080i frame is built in two parts, 540 lines at a time: odd lines first, then even. At no point are there any black lines in the picture; the even lines from the previous pass are still visible while the odd lines of the next pass are being drawn, and so on.
              It's still a 1080-line frame.
              It's interlacing and was done to reduce signal bandwidth for broadcasting.

              Every cable, satellite, and over-the-air HD signal is broadcast only in 720p or 1080i, using MPEG-2 compression.
              1080p is MPEG-4 and cannot be broadcast yet, as the bandwidth requirement is rather large and current ATSC receivers cannot even decode it.
              1080p TVs will only display 1080p images from a device that does all the decoding beforehand, e.g., gaming systems, DVD players, or Blu-ray players. A few high-end HDTVs will upconvert a 1080i signal after it's been decoded, if the TV is equipped to do such a thing. Most are not.
              I love watching the idiots drool all over their expensive 1080p TVs, only to realize that the picture is still 1080i.

              Upconverted 1080i ATSC signals are still not as clear as those from a true 1080p source.

              Originally posted by Spivonious View Post
              As far as refresh rates, I can see flicker up to 85Hz, so moving from 60/70Hz to 120Hz would definitely help the image look brighter for me. Too bad the source material is only 24/30fps.
              You may see flicker on a tube at 85Hz, but I highly doubt it on a flat panel.
              -Rick



              • #52
                Originally posted by dg View Post
                I'll try one more link before I give up. This one explains it better than either of the others. Wish I'd found it earlier:

                http://www.scantips.com/basics3b.html
                Okay, so if the printer is dithering the image, I was right, but I forgot about black ink. So divide the screen DPI by 4 and you get your printer DPI. Then, once you account for ink bleed, the resolution really is limited.
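
                In code form, the revised back-of-the-envelope (the dpi is illustrative, and real drivers dither over larger cells, so treat this as a loose upper bound):

                printer_dpi = 1200
                inks = 4                                 # C, M, Y, K
                matching_image_ppi = printer_dpi / inks  # = 300 ppi
                # Ink bleed (dot gain) spreads each droplet on the paper,
                # so the usable resolution ends up lower still.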
                Scott



                • #53
                  Originally posted by rjohnstone View Post
                  Where did I say any black lines were "inserted"?

                  A single 1080i frame is built in two parts, 540 lines at a time: odd lines first, then even. At no point are there any black lines in the picture; the even lines from the previous pass are still visible while the odd lines of the next pass are being drawn, and so on.
                  It's still a 1080-line frame.
                  It's interlacing and was done to reduce signal bandwidth for broadcasting.

                  Every cable, satellite, and over-the-air HD signal is broadcast only in 720p or 1080i, using MPEG-2 compression.
                  1080p is MPEG-4 and cannot be broadcast yet, as the bandwidth requirement is rather large and current ATSC receivers cannot even decode it.
                  1080p TVs will only display 1080p images from a device that does all the decoding beforehand, e.g., gaming systems, DVD players, or Blu-ray players. A few high-end HDTVs will upconvert a 1080i signal after it's been decoded, if the TV is equipped to do such a thing. Most are not.
                  I love watching the idiots drool all over their expensive 1080p TVs, only to realize that the picture is still 1080i.

                  Upconverted 1080i ATSC signals are still not as clear as those from a true 1080p source.


                  You may see flicker on a tube at 85Hz, but I highly doubt it on a flat panel.
                  I understand interlacing; I was just saying that at any one instant, only 540 lines are being drawn on the TV. It just alternates fast enough to trick your brain into thinking there are 1080 lines at once. When the odds are drawn, the evens are black. When the evens are drawn, the odds are black.

                  Most 1080p TVs today do upconvert 1080i images by halving the refresh rate: essentially, they wait for the even lines to come through before drawing the odd lines, and then draw both at the same time. So upconverted 1080i should theoretically look just as good as a 1080p source. I fail to see why it wouldn't, but I haven't seen an A/B comparison in real life, as my TV is 720p (more than enough for 32").
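
                  What's described here is "weave" deinterlacing; a rough Python sketch (numpy arrays standing in for decoded fields), including the catch that keeps it from always matching true 1080p:

                  import numpy as np

                  def weave(fields):
                      """Pair consecutive 540-line fields into 1080-line frames:
                      60 fields/sec in -> 30 full frames/sec out (the halved rate)."""
                      frames = []
                      for odd, even in zip(fields[0::2], fields[1::2]):
                          frame = np.empty((1080, 1920), dtype=odd.dtype)
                          frame[0::2] = odd    # odd field fills rows 0, 2, 4, ...
                          frame[1::2] = even   # even field fills rows 1, 3, 5, ...
                          frames.append(frame)
                      return frames

                  # The catch: the two fields are captured 1/60th of a second apart, so any
                  # motion between them shows up as "combing" in the woven frame -- one reason
                  # upconverted 1080i doesn't always match a native 1080p source.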


                  ANYWAY! Back to scanning images and printing them!
                  Scott



                  • #54
                    Originally posted by rjohnstone View Post
                    Where did I say any black lines were "inserted"?

                    A single 1080i frame is built in two parts, 540 lines at a time.
                    Odd lines first, then even. At no point are there any black lines in the picture as the even lines from the previous frame are still visible as the odd lines from the next frame are being drawn and so on.
                    It's still a 1080 line frame.
                    It's interlacing and was done to reduce signal bandwidth for broadcasting.
                    You are absolutely correct. The "i" in 1080i means interlaced. The easiest way to describe it is that interlaced has to scan the image onto the screen twice to produce the full resolution, whereas non-interlaced, or progressive, scan does it in one pass. That is also where the "p" in 1080p comes from. On an LCD the difference is a lot harder to notice than on a CRT television, because an LCD uses crystal shutters that stay illuminated with the last image or frame until it is updated.

                    Matt



                    • #55
                      Originally posted by Spivonious View Post
                      I understand interlacing; I was just saying that at any one instant, only 540 lines are being drawn on the TV. It just alternates fast enough to trick your brain into thinking there are 1080 lines at once. When the odds are drawn, the evens are black. When the evens are drawn, the odds are black.

                      Most 1080p TVs today do upconvert 1080i images by halving the refresh rate: essentially, they wait for the even lines to come through before drawing the odd lines, and then draw both at the same time. So upconverted 1080i should theoretically look just as good as a 1080p source. I fail to see why it wouldn't, but I haven't seen an A/B comparison in real life, as my TV is 720p (more than enough for 32").


                      ANYWAY! Back to scanning images and printing them!
                      There are 1080 lines of resolution on your screen at any one time. 1080i refreshes half of them, then goes back and refreshes the other half (odd, then even), 30 times a second. At no point in the refresh cycle are only 540 lines being displayed: if the current refresh is on the odd-numbered lines, the even-numbered lines are still displaying the last scan.



                      • #56
                        Stop arguing in my thread!

                        Ok, so it's as I feared: printers and displays are still not on the same page with the whole DPI thing. I thought that had been addressed a few years back, when I first started reading about all this. It seems it was on the table to be discussed, as mentioned in a MaximumPC article or two from the time.


                        So, it seems I'm doomed here

                        Thanks for the responses guys
                        I want to depart this world the same way I arrived; screaming and covered in someone else's blood

                        The most human thing we can do is comfort the afflicted and afflict the comfortable.

                        My Blog: http://newcenstein.com



                        • #57
                          Well, your problem is that you're scanning at the highest resolution possible, and scanning an entire page at that. Even if you have a 64-bit system with enough RAM, you're still working with an enormous file. Making any changes to the picture will take forever, if it finishes at all. Then you have to send that file over a USB 2.0 cable to your printer.

                          When you get ready to print, depending on what you're printing with, go into settings or print properties and find where it tells you it is shrinking the image to fit the page. Usually there is an indicator of how much the image is being shrunk just to fit on the page. That will give you an idea of how far over you are on your scan DPI.
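
                          A quick way to see how far over, in code form (the page size and resolutions are just examples):

                          # A full 8.5" x 11" page scanned at 2400 dpi:
                          scan_w, scan_h = 8.5 * 2400, 11 * 2400   # 20400 x 26400 pixels

                          # Printing back onto the same page at a sane 300 ppi only needs:
                          need_w, need_h = 8.5 * 300, 11 * 300     # 2550 x 3300 pixels

                          print(need_w / scan_w)                   # 0.125 -> the driver shrinks to 12.5%;
                                                                   # the scan has 8x more dpi than the print can use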

                          Matt

