Day 39/100 100 Days of Code

Info Hunter

I fixed 2 critical bugs in the program.

The first was that the program was not scanning the correct URLs, because I had forgotten to remove a test value from the request_info() call.

// From
cpr::Response r = Scraper::request_info(Scraper::baseURL);

// To
cpr::Response r = Scraper::request_info(getUrls[counter]);
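For context, here is a minimal sketch of how the corrected call could sit inside the scan loop. The Scraper layout and the placeholder URLs below are assumptions for illustration only; request_info() is treated as a thin wrapper around cpr::Get, which may differ from the real method.

#include <cpr/cpr.h>
#include <cstddef>
#include <string>
#include <vector>

// Hypothetical layout: request_info() as a thin wrapper around cpr::Get.
struct Scraper
{
    static cpr::Response request_info(const std::string &url)
    {
        return cpr::Get(cpr::Url{url});
    }
};

int main()
{
    // Placeholder URLs standing in for the scraped list.
    std::vector<std::string> getUrls = {"https://example.com/a",
                                        "https://example.com/b"};

    for (std::size_t counter = 0; counter < getUrls.size(); ++counter)
    {
        // The fix: request the URL currently being scanned,
        // not the hard-coded test value.
        cpr::Response r = Scraper::request_info(getUrls[counter]);
        if (r.status_code != 200)
            continue; // skip pages that did not load
    }
}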

The other bug was that the program failed to count how many keywords each URL had. To fix it, I removed the original code and used the std::find and std::count algorithms to get the correct keyword counts.

// From
for (const auto &url : getSettingsUrl)
{
    if (getUrls.empty())
    {
        getUrls.push_back(url);
    }

    if (std::find(getUrls.begin(), getUrls.end(), url) != std::end(getUrls))
    {
        counter++;
        continue;
    }
    else
    {
        urlCounterHolder.push_back(counter);
        counter = 0;
        getUrls.push_back(url);
        counter++;
    }
}

// To
for (const auto &url : getSettingsUrl)
{
    // First URL: record it and its keyword count.
    if (urlCounterHolder.empty())
    {
        int count = (int)std::count(getSettingsUrl.begin(),
                                    getSettingsUrl.end(), url);
        urlCounterHolder.push_back(count);
        getUrls.push_back(url);
    }

    // Skip URLs that have already been counted.
    if (std::find(getUrls.begin(), getUrls.end(), url) != std::end(getUrls))
    {
        continue;
    }
    else
    {
        // New URL: count every occurrence and remember it.
        int count = (int)std::count(getSettingsUrl.begin(),
                                    getSettingsUrl.end(), url);
        urlCounterHolder.push_back(count);
        getUrls.push_back(url);
    }
}
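To show the std::find/std::count pattern in isolation, here is a self-contained sketch with made-up sample data; each entry in getSettingsUrl stands in for one keyword hit, so duplicates mean a URL matched several keywords.

#include <algorithm>
#include <cstddef>
#include <iostream>
#include <string>
#include <vector>

int main()
{
    // Made-up data: one entry per keyword hit.
    std::vector<std::string> getSettingsUrl = {
        "https://example.com/a", "https://example.com/b",
        "https://example.com/a", "https://example.com/a"};

    std::vector<std::string> getUrls;  // unique URLs seen so far
    std::vector<int> urlCounterHolder; // keyword count per unique URL

    for (const auto &url : getSettingsUrl)
    {
        // Skip URLs that have already been counted.
        if (std::find(getUrls.begin(), getUrls.end(), url) != getUrls.end())
            continue;

        // std::count tallies every occurrence of this URL in one pass.
        urlCounterHolder.push_back(
            (int)std::count(getSettingsUrl.begin(), getSettingsUrl.end(), url));
        getUrls.push_back(url);
    }

    for (std::size_t i = 0; i < getUrls.size(); ++i)
        std::cout << getUrls[i] << ": " << urlCounterHolder[i] << '\n';
    // Prints:
    // https://example.com/a: 3
    // https://example.com/b: 1
}

As a side note, the urlCounterHolder.empty() branch in the fix above is arguably redundant: on the first iteration, std::find already fails on the empty getUrls and falls through to the counting branch, which is why the sketch omits it and still produces the same counts.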

I am pretty happy with the current state of the application. I will have to move on to the next project for now and might return later to add more features or make some fixes.

Video Demonstration