Counting Bits


Given an integer n, return an array ans of length n + 1 such that for each i (0 <= i <= n), ans[i] is the number of 1's in the binary representation of i.

 

Example 1:

Input: n = 2
Output: [0,1,1]
Explanation:
0 --> 0
1 --> 1
2 --> 10

Example 2:

Input: n = 5
Output: [0,1,1,2,1,2]
Explanation:
0 --> 0
1 --> 1
2 --> 10
3 --> 11
4 --> 100
5 --> 101

 

Constraints:

  • 0 <= n <= 10^5

 

Follow up:

  • It is very easy to come up with a solution with a runtime of O(n log n). Can you do it in linear time O(n) and possibly in a single pass? (A baseline sketch follows below for comparison.)
  • Can you do it without using any built-in function (e.g., __builtin_popcount in C++)?
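
For reference, a minimal O(n log n) baseline that avoids any built-in popcount might look like the sketch below; the helper name countBitsBaseline is illustrative, not part of the problem. It counts the bits of each number independently using Brian Kernighan's trick, which clears one set bit per loop iteration.

```cpp
#include <vector>
using std::vector;

// Baseline: count the 1-bits of each i on its own.
// x &= x - 1 clears the lowest set bit, so the inner loop
// runs once per set bit, giving O(n log n) overall.
vector<int> countBitsBaseline(int n) {
    vector<int> ans(n + 1, 0);
    for (int i = 1; i <= n; ++i) {
        int x = i, count = 0;
        while (x != 0) {
            x &= x - 1;  // drop the lowest set bit
            ++count;
        }
        ans[i] = count;
    }
    return ans;
}
```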

Related Topics

[Dynamic Programming] [Bit Manipulation]

Similar Questions

  1. Number of 1 Bits (Easy)

Hints

Hint 1 You should make use of what you have produced already.
Hint 2 Divide the numbers into ranges like [2-3], [4-7], [8-15] and so on, and try to generate each new range from the previous one.
Hint 3 Or does the odd/even status of the number help you in calculating the number of 1s?
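
One way to turn these hints into code is the single-pass O(n) sketch below, assuming the usual LeetCode-style signature. It reuses answers already produced (Hint 1) via the odd/even observation (Hint 3): i >> 1 has the same bits as i except for the lowest one.

```cpp
#include <vector>
using std::vector;

// DP on already-computed values:
//   ans[i] = ans[i >> 1] + (i & 1)
// i >> 1 drops the lowest bit of i, and (i & 1) adds it back
// if that bit was a 1, so each entry is filled in O(1).
vector<int> countBits(int n) {
    vector<int> ans(n + 1, 0);
    for (int i = 1; i <= n; ++i) {
        ans[i] = ans[i >> 1] + (i & 1);
    }
    return ans;
}
```

An equivalent recurrence, closer to Hint 2's range idea, is ans[i] = ans[i & (i - 1)] + 1, which strips the lowest set bit instead of the lowest bit.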