
I have a simple PHP crawler for a single URL.
It crawls and saves a record into the DB.
Now we need a freelancer who has skills in PHP crawling work.
The task is to update the source code to crawl multiple URLs.

...remove URLs from the meta tag?
I know this is a problem with the vBulletin engine, but I've already asked them about sorting this issue out and they said it was outside their remit.
By the way, the two PHP files responsible for the meta description tags include it, and they can be seen here:
[login to view URL]

...PHP files or in .htaccess, the search results URLs, to get an SEO-friendly URL structure, on [login to view URL]
Example: instead of (before click):
[login to view URL][]=1&data_location_1[country]=189
after click: [login to view URL]
I would prefer nice, friendly URLs (both before and after users click on the link)

...
Under the box will be a button called "Get Title Tag Links".
The box will allow up to 5000 URLs to be entered.
When someone enters URLs and clicks the button, the tool will visit all the URLs listed, grab the title tag of each one, and turn that title tag into the link to the page.
See the attachment.
I would like this tool done.
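The title-grabbing step of this tool can be sketched with Python's stdlib `html.parser`; this is a minimal illustration, not the deliverable. The fetching layer and the 5000-URL batching are left out, and the function names (`TitleParser`, `title_link`) are hypothetical.

```python
import re
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text inside the first <title> element of a page."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        # Only capture the first <title>; ignore any later ones.
        if tag == "title" and not self.title:
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def title_link(url, html):
    """Return an <a> tag whose visible text is the page's <title>,
    falling back to the URL itself when no title is found."""
    parser = TitleParser()
    parser.feed(html)
    text = re.sub(r"\s+", " ", parser.title).strip() or url
    return f'<a href="{url}">{text}</a>'
```

In the finished tool the 5000 entered URLs would each be fetched (with a timeout and error handling) and passed through `title_link` to build the output list.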

Looking for someone to rewrite the search results URLs to get an SEO-friendly URL structure.
Example: instead of (before click):
[login to view URL][]=1&data_location_1[country]=189
after click: [login to view URL]
I would prefer nice, friendly URLs (both before and after users click on the link).
matrimo.

...whether the websites (forums & blogs) are DoFollow or NoFollow.
You will get an Excel file with URLs, and you shall mark DoFollow or NoFollow in one column.
Please share your quote per 100 URLs.
However, we will have more than 100 URLs. We are still in the research phase, so we do not yet know how many URLs there will be in total.
!!! NOTE !!!

Looking for someone to rewrite some URLs.
Example: instead of (before click):
[login to view URL][]=1&data_location_1[country]=189
after click: [login to view URL]
I would prefer nice, friendly URLs (both before and after users click on the link).
[login to view URL]
Another example:
Instead

...updated and sent to search engines.
The correct structure of the site is index/category/product. Redirect all other URLs to this structure, not product?=tags, product?=search, index/product, or anything like that. Only SEO-friendly URLs with the structure index/category/product; the rest redirected and canonicalized.
The same in the sitemap: only the URL

Hi,
I have a Prestashop instance at 1.5.5. The "friendly URLs" feature will not work, and I need all pages to redirect to https://. There is a valid cert in place. I need someone to sort these two issues out. Please show experience and credentials. I am happy to agree an hourly rate. I am an IT professional but just do not have

...modifications to our WordPress theme.
Our site is a "top list" type of site.
On the front page there are links to external websites and links to reviews of those external websites.
On the Edit Post page in WP admin, for each post (each external website is a post), we want to be able to choose whether the link is "nofollow" or not, and/or whether there should be a direct link to the

Hi! I am new to this platform, and I want to scrape more than 2-3M URLs.
I have already written a script that only runs on macOS and uses AppleScript.
This is an essential condition. Currently, a single machine can run only a single script.
So if you are an expert in AppleScript, or you have multiple macOS machines, this project is applicable to you.

I will provide you a list of URLs; all URLs are from the same site, a popular ads website from the USA. I need the phone numbers from those URLs extracted to Excel.
This needs to be done using a script; no manual work, please.
The site has some security, so only experts are required.
I have attached the list of URLs from which I want to extract the phone numbers.
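The extraction step for a posting like this might look as follows in Python. The phone pattern below is an assumption (loose US-style numbers); a real listing site would need the regex tuned against sample pages, and the fetching and anti-bot handling are out of scope here. CSV is used because Excel opens it directly.

```python
import csv
import re

# Assumed pattern: optional +1/1 prefix, a 3-digit area code (optionally in
# parentheses), then 3+4 digits separated by spaces, dots, or dashes.
PHONE_RE = re.compile(r"(?:\+?1[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}")

def extract_phones(html):
    """Return the phone numbers found in a page, de-duplicated by digits,
    in order of first appearance."""
    seen, out = set(), []
    for m in PHONE_RE.finditer(html):
        digits = re.sub(r"\D", "", m.group())
        if digits not in seen:
            seen.add(digits)
            out.append(m.group().strip())
    return out

def save_csv(rows, path):
    """rows: (url, phone) pairs -> a two-column CSV that Excel opens."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["url", "phone"])
        writer.writerows(rows)
```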

It's a very simple task:
1. Visit the URL
2. Copy the IP address and result process number to a Google spreadsheet
Payment is $0.001 per URL (10 seconds to fill).
That is $0.10 per 100 rows and results.
This is the payment our client makes, and we cannot increase it.

Hi, I need an SEO gun to restore old pages/URLs from an old website to the new WordPress website. Using content found on the Wayback Machine, content is to be copied across page by page. URLs are to be restored EXACTLY the way the permalinks used to be. Tools such as Ahrefs and Search Console are to be used to identify 404 pages and restore lost links.

This is a small project to find fewer than 100 specified product pages with hidden URLs matching my description. I know the form of the URL, but we need to find the right range where these pages are, so you'll need software or programming skills to help. Thank you.

I will provide you with a list of websites from which I want you to scrape the business emails. I then want you to create an Excel document and put the business name in one column and the email address in the other column. The total number of websites is around 6,000.
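The core of an email-scraping job like this is a regex pass plus a two-column export. A minimal Python sketch, with an assumed (simplified) email pattern and CSV standing in for the Excel output:

```python
import csv
import re

# Simplified email pattern; it misses exotic but valid addresses,
# which is usually acceptable for contact-scraping work.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def first_email(html):
    """Return the first email address found in a page, or '' if none."""
    m = EMAIL_RE.search(html)
    return m.group() if m else ""

def write_contacts(rows, path):
    """rows: (business_name, email) pairs -> two-column CSV for Excel."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["business", "email"])
        writer.writerows(rows)
```

For ~6,000 sites the fetch loop would also need timeouts, retries, and politeness delays, which are omitted here.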

...contains a list of valid URLs, one URL per line) and detect whether each site is an e-commerce shop by trying to detect the existence of a shopping cart (this will require manually testing some addresses in order to provide as many tests, using regexps or any other technique, to discover this as accurately as possible; you will be provided with a starting list of URLs). The output
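The regexp approach the posting mentions could start like this in Python. The marker list is an assumption to be refined by manually inspecting sample sites, exactly as the posting describes:

```python
import re

# Heuristic markers that commonly appear on shop pages; this starting list
# is an assumption and should be grown from manual inspection of real sites.
CART_PATTERNS = [
    r"add[\s_-]*to[\s_-]*cart",
    r"shopping[\s_-]*cart",
    r"add[\s_-]*to[\s_-]*basket",
    r"checkout",
]
CART_RE = re.compile("|".join(CART_PATTERNS), re.IGNORECASE)

def looks_like_shop(html):
    """True if the page HTML contains a shopping-cart style marker."""
    return CART_RE.search(html) is not None
```

Accuracy would be measured by running the detector over the hand-labelled starting list and adjusting the patterns until false positives and negatives are acceptable.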

On this web application there are a few URIs that, once you perform a GET request to them, are slow and can take up to 50 seconds to load. The most critical one is the landing page; the remaining ones will be disclosed once you bid on the project.

I want you to create an Excel spreadsheet listing 4,000 websites of restaurants and medium-sized businesses in the Boston and greater Boston area. I want the website, the email address, and the company name in separate columns. I am looking to compile all of these emails for a mail merge.
The skills required are: data processing, data entry, Excel.

I have over 100 WordPress sites that need to be launched. This includes the registration and content population of each site, theme setup, and basically the monitoring of each over the initial launch period. I seek a qualified, affordable English speaker who can walk me through this for the first ten sites. This will be done over a month or two, and we'll go from there for the rest.
Please bid you...

...DETAILS OF THE JOB
I will provide you a list of article titles; they will be like "Top 20 Tactical Belts In 2018" and "Top 20 Mens Ripped Jeans In 2018", and you are to go online and find the very best products for each article that are selling on well-known online sites (Amazon, Nordstrom, Macy's) and compile them into a list.
It is CRUCIAL that you have a strong und...

Hello,
I need someone who can write me a Python script that reads an Excel file (with URLs in it), executes them one by one, and, if the page loads, says "it's working"; if not, it says "not working" and saves this output in a new Excel file next to the URL.
Let me know if you need more clarification.
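A sketch of such a checker, with two simplifications: CSV stands in for the Excel files to stay stdlib-only (a real version would swap in openpyxl for .xlsx), and the fetcher is injectable so the load check can be exercised without a network.

```python
import csv
from urllib.request import urlopen

def check(url, fetch=None):
    """Return 'its working' if the page loads, else 'not working'.

    fetch is injectable for testing; by default it does a real HTTP GET.
    """
    fetch = fetch or (lambda u: urlopen(u, timeout=10))
    try:
        fetch(url)
        return "its working"
    except Exception:
        return "not working"

def check_all(urls, out_path, fetch=None):
    """Write url,status rows to out_path (CSV standing in for Excel)."""
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["url", "status"])
        for url in urls:
            writer.writerow([url, check(url, fetch)])
```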

Step 1: User clicks on my domain: [login to view URL]
Step 2: User is redirected to [login to view URL]
Step 3: User is further redirected to [login to view URL]
Here I want the second URL to load properly, but it must not be visible to anyone.
Hint: This script will be written in PHP through iframing, but the iframe tag must not be used anywhere in the script.

Using Python, given a list of URLs, the function creates an image grid and saves it to the desired path.
The grid must be dynamic: the user must be able to specify the number of rows and columns. The pictures must be placed in the same order as the list, row by row.
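A minimal sketch of this function, assuming Pillow is available for the image work and a fixed cell size (the posting does not specify one). The row-by-row placement logic is kept in a separate pure helper:

```python
def grid_positions(n, rows, cols, cell_w, cell_h):
    """(x, y) paste offsets for n images filled row by row into rows x cols."""
    if n > rows * cols:
        raise ValueError("grid too small for the number of images")
    return [((i % cols) * cell_w, (i // cols) * cell_h) for i in range(n)]

def make_grid(urls, rows, cols, out_path, cell=(256, 256)):
    """Download each image, resize it to one cell, paste row by row, save."""
    from io import BytesIO              # imports kept local so grid_positions
    from urllib.request import urlopen  # is usable without Pillow installed
    from PIL import Image               # assumes Pillow is available

    canvas = Image.new("RGB", (cols * cell[0], rows * cell[1]))
    offsets = grid_positions(len(urls), rows, cols, *cell)
    for url, (x, y) in zip(urls, offsets):
        img = Image.open(BytesIO(urlopen(url).read())).resize(cell)
        canvas.paste(img, (x, y))
    canvas.save(out_path)
```

Because `zip` pairs the URLs with the offsets in order, the pictures land in list order, row by row, as required.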

I will provide the freelancer with several web pages that I want him/her to scrape business web URLs from. I will then want the freelancer to use these company websites to create an Excel document with the company name and contact email address in separate columns.

1) I have a custom plugin to manage questions from visitors. Each question generates a new page (similar to a post page). I need you to modify the plugin code so it sets all question pages as NOINDEX, NOFOLLOW.
Please see the attached Word document.

I need a Python script that crawls URLs and saves the page content (HTML code) in a MySQL DB. This is the task:
- see how the page should look: [login to view URL]
// should take less than 2h
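The shape of that script is roughly the following. Two hedges: sqlite3 stands in here for the requested MySQL (the SQL is the same shape with mysql-connector, modulo `%s` placeholders), and the fetcher is injectable so the store logic is testable offline.

```python
import sqlite3
from urllib.request import urlopen

def save_page(conn, url, fetch=None):
    """Fetch one URL and store its HTML keyed by URL.

    sqlite3 is a stand-in for MySQL; swap the connection object and change
    the '?' placeholders to '%s' for mysql-connector.
    """
    fetch = fetch or (
        lambda u: urlopen(u, timeout=10).read().decode("utf-8", "replace")
    )
    html = fetch(url)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS pages (url TEXT PRIMARY KEY, html TEXT)"
    )
    conn.execute(
        "INSERT OR REPLACE INTO pages (url, html) VALUES (?, ?)", (url, html)
    )
    conn.commit()
    return len(html)

def crawl(urls, db_path=":memory:", fetch=None):
    """Crawl every URL in the list into one DB and return the connection."""
    conn = sqlite3.connect(db_path)
    for url in urls:
        save_page(conn, url, fetch)
    return conn
```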

...there are many.
I just loaded their page, and an ESET warning came up saying content had been blocked, and a video appeared over the page.
It needs the following doing, please:
Find and remove the virus. The screenshot shows the virus in action; it is intermittent, so it is not always showing. Google has yet to block this site; we don't have a Google Webmaster account.

Hello,
I am looking for a rule that will hide all .php extensions and give all the URLs a trailing slash. Some of my URLs also have dashes (-) in them, so this will also need to be accounted for when creating the rule.
I have attempted:
RewriteCond /%{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)/$ /$[login to view URL]
Please do not ask for server access.

Industry: Mortgage Finance
Fluent English speakers ONLY
Meta descriptions for SEO
Character limit: 150-160 characters
for 13 URLs
1-2 day turnaround time
Keywords that need to be inserted into the descriptions will be provided

Here is the problem I currently have:
The image is visible on the web page, but I cannot get the image URL from the source view.
It's probably obfuscated; in the source code, I only see <div> tags.
But in the Chrome Developer Tools -> Network view, I can see the final requested image URL.
So, I want that final requested image URL from the page, not obtained manually.
C# project using C...

...use it starting here:
[login to view URL]
And I want you to create a separate HTML page that uses an iframe or another related technique to smoothly include 2 external URLs in a simple way, similar to the screenshots attached:
[login to view URL]
[login to view URL]
Please include in your experience