String to Integer (atoi)

Problem Description

Implement atoi to convert a string to an integer.

Requirements for atoi:
The function first discards as many whitespace characters as necessary until the first non-whitespace character is found. Then,
starting from this character, takes an optional initial plus or minus sign followed by as many numerical digits as possible, and
interprets them as a numerical value.
The string can contain additional characters after those that form the integral number, which are ignored and have no effect on the
behavior of this function.

If the first sequence of non-whitespace characters in str is not a valid integral number, or if no such sequence exists because either
str is empty or it contains only whitespace characters, no conversion is performed.
If no valid conversion could be performed, a zero value is returned. If the correct value is out of the range of representable values,
INT_MAX (2147483647) or INT_MIN (-2147483648) is returned.
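
The clamping rule in the last requirement can also be enforced without a 64-bit accumulator. Below is a minimal sketch of the rules above, written purely for illustration (the post's own solution appears in the Implementation section further down); the method name parseClamped is chosen for this example only, and overflow is detected by checking before each multiply-and-add step.

    // Illustration only: parse following the rules above, clamping to the int range
    // without a long accumulator. The name parseClamped is arbitrary.
    static int parseClamped(String s) {
        int i = 0, n = s.length();
        while (i < n && Character.isWhitespace(s.charAt(i)))
            i++;                                   // discard leading whitespace
        if (i == n)
            return 0;                              // empty or whitespace only: no conversion
        int sign = 1;
        if (s.charAt(i) == '+' || s.charAt(i) == '-') {
            sign = (s.charAt(i) == '-') ? -1 : 1;  // optional leading sign
            i++;
        }
        int result = 0;
        while (i < n && Character.isDigit(s.charAt(i))) {
            int digit = s.charAt(i) - '0';
            // Would result * 10 + digit exceed Integer.MAX_VALUE? Clamp and stop
            // (this also yields INT_MIN correctly for an input of exactly -2^31).
            if (result > (Integer.MAX_VALUE - digit) / 10)
                return sign == 1 ? Integer.MAX_VALUE : Integer.MIN_VALUE;
            result = result * 10 + digit;
            i++;
        }
        return sign * result;                      // characters after the digits are ignored
    }

With this check, an out-of-range input such as "-91283472332" (see Example 5 below) stops as soon as the magnitude would exceed Integer.MAX_VALUE and returns Integer.MIN_VALUE.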

Examples

  • Example 1:

Input: "42"

Output: 42

  • Example 2:

Input: " -42"

Output: -42

Explanation: The first non-whitespace character is '-', which is the minus sign.

Then take as many numerical digits as possible, which gets 42.

  • Example 3:

Input: "4193 with words"

Output: 4193

Explanation: Conversion stops at digit '3' as the next character is not a numerical digit.

  • Example 4:

Input: "words and 987"

Output: 0

Explanation: The first non-whitespace character is 'w', which is not a numerical

digit or a +/- sign. Therefore no valid conversion could be performed.

  • Example 5:

Input: "-91283472332"

Output: -2147483648

Explanation: The number "-91283472332" is out of the range of a 32-bit signed integer.
Therefore INT_MIN (−2^31) is returned.

Implementation

    public class Solution {
        public int myAtoi(String str) {
            if (str == null)
                return 0;
            // Discard leading/trailing whitespace first; the string may become empty.
            str = str.trim();
            if (str.length() == 0)
                return 0;

            int sign = 1, start = 0, len = str.length();
            long sum = 0; // long accumulator so overflow past the int range can be detected

            // Optional leading '+' or '-' sign.
            char firstChar = str.charAt(0);
            if (firstChar == '+') {
                start++;
            } else if (firstChar == '-') {
                sign = -1;
                start++;
            }

            // Consume digits until a non-digit character (or the end) is reached.
            for (int i = start; i < len; i++) {
                if (!Character.isDigit(str.charAt(i)))
                    return (int) (sign * sum);
                sum = sum * 10 + (str.charAt(i) - '0');
                // Clamp as soon as the value leaves the 32-bit signed range.
                if (sign == 1 && sum > Integer.MAX_VALUE)
                    return Integer.MAX_VALUE;
                if (sign == -1 && -sum < Integer.MIN_VALUE)
                    return Integer.MIN_VALUE;
            }
            return (int) (sign * sum);
        }
    }
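
To sanity-check the solution against the five examples above, a small driver like the one below can be used. This test harness is my own addition, not part of the original post, and the class name AtoiDemo is arbitrary; it assumes the Solution class above is on the classpath.

    public class AtoiDemo {
        public static void main(String[] args) {
            Solution solution = new Solution();
            // Inputs taken from Examples 1-5; expected outputs: 42, -42, 4193, 0, -2147483648
            String[] inputs = { "42", " -42", "4193 with words", "words and 987", "-91283472332" };
            for (String input : inputs) {
                System.out.println("\"" + input + "\" -> " + solution.myAtoi(input));
            }
        }
    }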

posted @ 2019-07-18 14:10  AI,me