Overclock to 1.113GHz... Anyone try this yet?


  1. #11 mah (Senior Member, joined Jan 2010, 234 posts)
    Risks to hardware are not constrained to voltage increases. Faster clocks result in more heat and the phone was designed to dissipate heat based on its as-delivered clock speed. A faster clock will also result in a larger current draw from the battery, leading to (obvious) faster battery depletion and (perhaps not so obvious) more heat from the battery, adding to the extra heat from the CPU.

    "since its running at the same temp" -- I don't believe this statement. Nothing comes for free, and faster clocks directly relate to more current consumption and heat.

    Overclocking a phone: cool from a geek perspective, but not worthwhile otherwise, on a platform whose primary uses are I/O bound rather than CPU bound.
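
As a rough illustration of the point mah is making, the standard first-order CMOS model is P_dyn ~ a * C * V^2 * f: at a fixed core voltage, dynamic power scales linearly with the clock. The sketch below just puts placeholder numbers on it; the activity factor, capacitance, and voltage are invented round figures, not Nexus One (QSD8250) measurements.

Code:
    # First-order CMOS dynamic-power sketch: P_dyn ~ a * C * V^2 * f.
    # All constants are invented placeholders, not Snapdragon measurements.
    ALPHA = 0.15        # assumed average switching-activity factor
    C_EFF = 1.0e-9      # assumed effective switched capacitance (farads)
    V_CORE = 1.2        # assumed core voltage (volts), held constant

    def dynamic_power_watts(freq_hz, v_core=V_CORE, alpha=ALPHA, c_eff=C_EFF):
        """Dynamic power of the core while it is switching at freq_hz."""
        return alpha * c_eff * v_core ** 2 * freq_hz

    stock = dynamic_power_watts(998e6)        # stock Nexus One clock
    overclocked = dynamic_power_watts(1.113e9)
    print(f"stock:       {stock:.3f} W")
    print(f"overclocked: {overclocked:.3f} W")
    print(f"increase:    {overclocked / stock - 1:.1%}")   # ~11.5%, tracking f

At constant voltage the dynamic term alone grows by the clock ratio (1113/998, roughly 12%), and that extra power comes out of the battery and goes into the case as heat, which is the dissipation concern raised above. Leakage is a separate term that depends mostly on voltage and temperature.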

  2. #12 ezeddiekun (Senior Member, joined Jan 2010, 776 posts)
    Quote Originally Posted by supsandroid View Post
    I read that the actual voltage doesn't increase at all so there is virtually no threat to the hardware since its running at the same temp.
    Temperature will increase. If you don't touch the voltage you can only overclock by a small amount. The claim of no hardware damage is BS.
    Froyo 2.2 Stock
    RA-nexus-v1.7.0

  3. #13 hawon (Senior Member, joined Dec 2009, 170 posts)
    More than overclocking, I'm really curious to see this running on the Nexus One:

    Myriad Dalvik Turbo hands-on: Android apps just got fast

  4. #14 tech (Member, joined Jan 2010, 56 posts)
    How hot is too hot?

  5. #15 dogie (Banned, joined Jan 2010, 4 posts)
    Quote Originally Posted by mah View Post
    Risks to hardware are not constrained to voltage increases. Faster clocks result in more heat and the phone was designed to dissipate heat based on its as-delivered clock speed. A faster clock will also result in a larger current draw from the battery, leading to (obvious) faster battery depletion and (perhaps not so obvious) more heat from the battery, adding to the extra heat from the CPU.

    "since its running at the same temp" -- I don't believe this statement. Nothing comes for free, and faster clocks directly relate to more current consumption and heat.

    Overclocking a phone: cool from a geek perspective, but not worthwhile otherwise, on a platform whose primary uses are I/O bound rather than CPU bound.
    As a hardware engineer, without getting complicated one can say that increasing clock speeds alone will literally do nothing in terms of power consumption and heat dissipation.

    With the core voltage staying the same (if it were not, it would be a different story), the current drawn is the same, and the tiny marginal 'friction' from the higher clock speed is negligible, to the point that your ambient temperature will have a greater impact by several degrees.

    Oops edit:

    Also forgot to say: it is very likely that power consumption will DECREASE. With the processor running faster at the same power, each action requires less CPU time, and so LESS energy is drawn. And because the differences in heat and power from stock to overclocked are so marginal, one can usually expect to see an INCREASE in efficiency.



    It's a ****e graph, but it shows that up to a point (the efficiency threshold of the architecture), power consumption per performance unit CAN increase, if not stay the same.
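
dogie's efficiency claim is essentially the race-to-idle argument: at a fixed voltage, power while busy scales roughly with f, the time to finish a fixed piece of work scales roughly with 1/f, so the energy per task comes out about constant. A minimal sketch under those idealized assumptions (perfectly linear scaling, zero draw once the work is done); the wattage is an invented figure:

Code:
    # Race-to-idle sketch: energy for a fixed task at two clock speeds.
    # Idealized assumptions: busy power scales linearly with frequency
    # (voltage held constant) and the CPU draws nothing once the task is done.
    BASE_FREQ_HZ = 998e6
    BASE_BUSY_POWER_W = 0.6     # invented busy power at the stock clock

    def energy_per_task_joules(freq_hz, work_cycles):
        busy_power = BASE_BUSY_POWER_W * (freq_hz / BASE_FREQ_HZ)  # P ~ f
        busy_time = work_cycles / freq_hz                          # t ~ 1/f
        return busy_power * busy_time

    work = 5e9  # an arbitrary fixed amount of work, in CPU cycles
    print(energy_per_task_joules(998e6, work))    # ~3.01 J
    print(energy_per_task_joules(1.113e9, work))  # ~3.01 J: f cancels out

The whole argument rests on the "draws nothing afterwards" assumption; the replies below attack exactly that point.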

  6. #16 spidermonkeyrob (Junior Member, joined Feb 2010, 12 posts)
    Uh. Isn't the processor running at that frequency constantly? If so, what you said doesn't make sense at all. If it underclocked itself when not in use, I'd say there would be more merit to it.

  7. #17 ezeddiekun (Senior Member, joined Jan 2010, 776 posts)
    Can you get to 1.113GHz without touching voltage? I can get from 2.4GHz to 2.9GHz without touching voltage on my computer =D but that's a whole different story.
    Froyo 2.2 Stock
    RA-nexus-v1.7.0

  8. #18 mah (Senior Member, joined Jan 2010, 234 posts)
    Current consumption is directly related to the switching of gates. The faster your clock, the more frequently those gates switch. All other things remaining static, you cannot increase the clock without increasing the current consumption -- it's basic design physics.

    If you change the hardware design to reduce the number of active gates, you can increase the clock without increasing current (but apples to apples it is still an increase, since under that same design change the slower clock would still result in a smaller current draw).

    If you change the silicon to a smaller process you can keep the same current when using a faster clock, but there you're also using a smaller core voltage and thus, again, it's an apples-to-oranges comparison.

    The only way one can measure reduced or even equal current over time when clock is increased is if the clock increase allows all CPU work to be done faster leading to perfect idle time (meaning no current consumption during idle time). It's possible to do this in the lab using carefully crafted benchmark applications to drive the CPU, but that renders you with a useless system.

    If current consumption did not increase when clock did, netbook and notebook vendors would provide faster clocks so people would be more likely to use those systems instead of desktops.

    If current consumption did not increase when clock did, mobile phones would not have been clocked at 100MHz even when the processors in them could support double or faster clocks.

    I've been working with mobile phone manufacturers for several years, and they all have the same requirements in their systems. These requirements have different priorities, but current consumption is at the top of all of their lists -- to the extent that if they can use a part that consumes 10uA instead of one that consumes 20uA, even at the expense of increasing software design burden, they will. They squeeze every bit of battery life they can, and these are the same manufacturers that conscientiously reduce CPU clock for this same reason.


    Quote Originally Posted by dogie View Post
    As a hardware engineer, without getting complicated one can say that increasing clock speeds alone will literally do nothing in terms of power consumption and heat dissipation.
    Being a hardware engineer doesn't mean you know much about embedded system design and current measurement. Quoting a PC analysis without regard to what software is (or, more importantly, is not) running does nothing to argue mobile device current consumption. Making a statement as you did without addressing gate switching = current consumption, and faster clock = more gate switching, suggests to me that if you're a hardware engineer, you spend more of your time in analog, not digital.
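
To put numbers on the "perfect idle time" condition mah describes above, here is a sketch of the charge drawn over a fixed measurement window, busy plus idle. The currents are invented round figures; the point is only that the overclocked run breaks even solely when idle draw is effectively zero.

Code:
    # Charge drawn over a fixed measurement window while completing fixed work.
    # Currents are invented round numbers, not Nexus One measurements.
    WINDOW_S = 60.0          # fixed measurement window, seconds
    IDLE_CURRENT_A = 0.020   # assumed draw while the CPU has nothing to do

    def charge_drawn_coulombs(freq_hz, busy_current_a, work_cycles,
                              idle_current_a=IDLE_CURRENT_A):
        busy_time = min(work_cycles / freq_hz, WINDOW_S)
        idle_time = WINDOW_S - busy_time
        return busy_current_a * busy_time + idle_current_a * idle_time

    work = 30e9  # fixed work to finish inside the window, in cycles
    # Busy current scaled linearly with the clock (voltage held constant).
    stock = charge_drawn_coulombs(998e6, 0.300, work)
    overclocked = charge_drawn_coulombs(1.113e9, 0.300 * 1113 / 998, work)
    print(f"stock:       {stock:.2f} C")        # ~9.62 C
    print(f"overclocked: {overclocked:.2f} C")  # ~9.68 C
    # The busy charge is identical in both runs (current and time trade off
    # exactly), but the overclocked run spends more of the window idle, and
    # that idle time is only free when IDLE_CURRENT_A is ~0 -- otherwise the
    # overclocked total comes out slightly higher, which is mah's condition.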

  9. #19 dogie (Banned, joined Jan 2010, 4 posts)
    Quote Originally Posted by mah View Post
    Current consumption is directly related to the switching of gates. The faster your clock, the more frequently those gates switch. All other things remaining static, you cannot increase the clock without increasing the current consumption -- its basic design physics.

    If you change the hardware design to reduce the number of active gates, you can increase clock without increasing current (but this is still an apples-to-apples increase in current since under the same design change, the slower clock still results in smaller current draw).

    If you change the silicon to a smaller process you can reduce the keep the same current when using a faster clock, but here you're also using a smaller core voltage and thus again, it's an apples to oranges comparison.

    The only way one can measure reduced or even equal current over time when clock is increased is if the clock increase allows all CPU work to be done faster leading to perfect idle time (meaning no current consumption during idle time). It's possible to do this in the lab using carefully crafted benchmark applications to drive the CPU, but that renders you with a useless system.

    If current consumption did not increase when clock did, netbook and notebook vendors would provide faster clocks so people would be more likely to use those systems instead of desktops.

    If current consumption did not increase when clock did, mobile phones would not have been clocked at 100MHz even when the processors in them could support double or faster clocks.

    I've been working with mobile phone manufacturers for several years, and they all have the same requirements in their systems. These requirements have different priorities, but current consumption is at the top of all of their lists -- to the extent that if they can use a part that consumes 10uA instead of one that consumes 20uA, even at the expense of increasing software design burden, they will. They squeeze every bit of battery life they can, and these are the same manufacturers that conscientiously reduce CPU clock for this same reason.


    Quote Originally Posted by dogie View Post
    As a hardware engineer, without getting complicated one can say that increasing clock speeds alone will literally do nothing in terms of power consumption and heat dissipation.
    Being a hardware engineer doesn't mean you know much about embedded system design and current measurement. Quoting a PC analysis without regards to what software is (or, more importantly, is not) running does nothing to argue mobile device current consumption. Making a statement as you did without addressing gate switching = current consumption, and faster clock = more gate switching, suggests to me that if you're a hardware engineer, you spend more of your time in analog, not digital.
    Yes, and if you read what I said: while you're drawing more current per time unit, you need to be running for less time, and they do cancel each other out as long as you're within the limits of the architecture.

  10. #20 mah (Senior Member, joined Jan 2010, 234 posts)
    Quote Originally Posted by dogie View Post
    Yes, and if you read what I said: while you're drawing more current per time unit, you need to be running for less time, and they do cancel each other out as long as you're within the limits of the architecture.
    What's the difference between theory and reality? In theory, there isn't any difference. In reality, there's a huge difference.

    I read what you said. The problem is that you're living in theory and your phone is living in reality. Reality: the CPU is not going to slow down just because your task of interest is idle or stopped; the clock is going to stay overclocked for anywhere between some period of time I cannot predict and forever.

    Applications that run on a mobile phone are not going to be overwhelmingly CPU bound; they're going to be I/O bound. The clock is not going to come to a crawl every time you're blocked waiting for I/O to complete, and by "not come to a crawl" I mean "going to stay overclocked".

    On a static system where you get to define exactly what's running and you can finely control clock speed, I will agree that current consumption (and heat dissipation) are not going to change appreciably based on clock speed. This makes an impossible assumption though: that you have nothing else to run other than your single benchmarking task, and that once that task is completed, nothing else will start running (until after we've taken our current measurement).

    Reality is a much more insidious environment.
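
To express mah's I/O-bound point in the same terms as the earlier sketches: a minimal model of one transaction, some compute followed by a fixed I/O wait, assuming, as the post argues, that the clock (and the clock-proportional draw) stays pinned while the CPU sits blocked. The current figures are invented.

Code:
    # One I/O-bound transaction: some compute, then a fixed I/O wait.
    # Assumption taken from the post above: the clock does not drop while
    # blocked, so the wait is billed at the clock-proportional rate.
    # Currents are invented round numbers.
    STOCK_FREQ_HZ = 998e6

    def transaction_charge_coulombs(freq_hz, compute_cycles, io_wait_s,
                                    busy_current_stock_a=0.300,
                                    wait_current_stock_a=0.080):
        scale = freq_hz / STOCK_FREQ_HZ                   # current ~ f at fixed V
        compute = busy_current_stock_a * scale * (compute_cycles / freq_hz)
        wait = wait_current_stock_a * scale * io_wait_s   # wait does not shrink
        return compute + wait

    compute_cycles = 2e8   # cycles of actual CPU work per transaction
    io_wait_s = 0.5        # seconds blocked on flash/radio, clock-independent
    print(transaction_charge_coulombs(998e6, compute_cycles, io_wait_s))    # ~0.100 C
    print(transaction_charge_coulombs(1.113e9, compute_cycles, io_wait_s))  # ~0.105 C
    # The compute portion costs the same either way; the wait is just as long
    # but drawn at the higher rate, so the overclocked transaction costs more.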
