What's New in Azure DevOps Server 2020.1 (TFS)

1. Overview

Starting in 2005, Microsoft released TFS 2005, the first version of TFS (renamed Azure DevOps Server beginning with the 2019 release), as the successor to VSS. It was followed by the 2008, 2010, 2012, 2013, 2015, 2017, 2018, 2019, and 2020 releases, each of which brought exciting new features to users.

Likewise, Microsoft's upcoming Azure DevOps Server 2020 Update 1 brings another wave of features for software development and project management. The rest of this article summarizes the main features of this latest release.

Boards


State transition restriction rules


We continue to close the feature parity gap between hosted XML and the inherited process model. This
new work item type rule lets you restrict work items from being moved from one state to another. For
example, you can restrict Bugs from going from New to Resolved. Instead, they must go from New ->
Active -> Resolved.
You can also create a rule to restrict state transitions by group membership. For example, only users in
the “Approvers” group can move user stories from New -> Approved.

Copy work item to copy children


One of the top requested features for Azure Boards is the ability to copy a work item that also copies the
child work items. We added a new "Include child work items" option to the copy work item dialog.
When selected, this option will copy the work item and copy all child work items (up to 100).

Improved rules for activated and resolved fields


Up until now, the rules for Activated By, Activated Date, Resolved By, and Resolved Date have been a
mystery. They are only set for system work item types and are specific to the state value of "Active" and
"Resolved". We've changed the logic so that these rules are no longer for a specific state. Instead, they
are triggered by the category (state category) that the state resides in. For example, let's say you have a
custom state of "Needs Testing" in the Resolved category. When the work item changes from "Active" to
"Needs Testing", the Resolved By and Resolved Date rules are triggered.
This allows customers to create any custom state values and still generate the Activated By, Activated
Date, Resolved By, and Resolved Date fields, without the need to use custom rules.

Stakeholders can move work items across board columns


Stakeholders have always been able to change the state of work items. But when they go to the Kanban
board, they're unable to move the work items from one column to another. Instead, Stakeholders would
have to open each work item, one at a time, and update the state value. This has long been a pain point
for customers, and we're happy to announce that you can now move work items across board columns.

System work item types on backlogs and boards


Now you can add a system work item type to the backlog level of choice. Historically these work item
types have only been available from queries.
The system work item types you can add, by process:
• Agile: Issue
• Scrum: Impediment
• CMMI: Change Request, Issue, Review, Risk
Adding a system work item type to a backlog level is simple. Just go into your custom process and
click the Backlog Levels tab. Select your backlog level of choice (example: Requirements Backlog)
and choose the edit option. Then add the work item type.

Audit logging event


We have added a new event to the audit logs to help customers better track process-related changes.
An event will be logged whenever the values of a picklist are changed. Changes to picklist fields are
usually the most common changes made to a process. With this new event, collection admins can better
track when those fields were changed and by whom.

Azure Boards GitHub app repo limit raised


The repo limit for the Azure Boards application in the GitHub marketplace has been increased from 100
to 250.

Customize work item state when pull request is merged


Not all workflows are the same. Some customers want to close out their related work items when a Pull
Request is completed. Others want to set the work items to another state to be validated before closing.
We should allow for both.
We have a new feature that allows you to set the work items to the desired state when the pull request
is merged and completed. To do this, we scan the pull request description and look for the state value
followed by the #mention of the work item(s). In this example, we are setting two user stories to
Resolved and closing two tasks.
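Here is a hedged sketch of what such a pull request description might look like; the work item IDs are hypothetical, and the exact keyword syntax is described in the documentation:

This change reworks the sign-in flow.
Resolved #1215 #1216
Closed #1217 #1218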

Link your work item to builds in another project


You can now easily track your build dependencies across projects just by linking your work item to a
Build, Found in build, or Integrated in build.

Editing description (help text) on system fields


You have always been able to edit the description of custom fields. But for system fields like priority,
severity, and activity, the description was not editable. This was a feature gap between the Hosted XML
and Inherited models that prevented some customers from migrating to the Inherited model. You can now edit
the description on system fields. The edited value will only affect that field in the process and for that
work item type. This gives you the flexibility to have different descriptions for the same field on different
work item types.

Customize work item state when pull request is merged


Pull requests often refer to multiple work items. When you create or update a pull request, you may
want to close some of them, resolve some of them, and keep the rest open. You can now use state keywords
in the pull request description, as described above, to accomplish that. See the documentation for more details.

Parent field on the task board


Due to popular request, you can now add the Parent field to both the child and parent cards on the Task
Board.

Removing "Assigned To" rule on Bug work item type


There are several hidden system rules across all the different work item types in Agile, Scrum, and
CMMI. These rules have existed for over a decade and have generally worked fine without any
complaints. However, there are a couple of rules that have outstayed their welcome. One rule in particular
has caused a lot of pain for new and existing customers and we have decided it was time to remove it.
This rule exists on the Bug work item type in the Agile process:
"Set the Assigned To value to Created By when the state is changed to Resolved"
We received a lot of your feedback about this rule. In response, we went ahead and removed this rule
from the Bug work item type in the Agile process. This change will affect every project using an inherited
Agile or a customized inherited Agile process. For those customers who like and depend on this rule,
please see our blog post on the steps you can take to re-add it using custom rules.

Removed items on the Work Items page


The Work Items page is a great place to quickly find items you created or that are assigned to you,
amongst other things. It provides several personalized pivots and filters. One of the top complaints for
the "Assigned to me" pivot is that it displays Removed work items (that is, work items in the Removed
state). And we agree! Removed items are work that is no longer of value and thus has been removed
from the backlog, so including it in this view is not helpful.
You can now hide all Removed items from the Assigned to me pivot on the Work Items page.

Repos


Default branch name preference


Azure Repos now offers a customizable default branch name for Git. In repository settings, you may
choose any legal branch name to use when a repository is initialized. Azure Repos has always supported
changing the default branch name for an existing repository. Visit Manage branches for more details.
Note: if you don't enable this feature, your repositories will be initialized with Azure Repos's default
name. Right now, that default is master. To honor Microsoft's commitment to, and customer requests
for, inclusive language, we'll be joining industry peers in changing this default to main. That change will
occur later this summer. If you want to keep using master, you should turn on this feature now and set
it to master.

Collection-level setting for default branch


There is now a collection-level setting for your preferred initial branch name for new repositories. If a
project has not chosen an initial branch name, this collection-level setting will be used. If you do not
specify the initial branch name in the collection settings or the project settings, new repositories will use
an Azure DevOps defined default.

Add a new auth scope for contributing PR comments


This release adds a new OAuth scope for reading/writing pull request comments. If you have a bot or
automation that only needs to interact with comments, you can give it a PAT with only this scope. This
reduces the blast radius if the automation has a bug or if the token is compromised.
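As an illustration, a token limited to this scope could create a comment thread on a pull request with a call along these lines (a sketch only; the placeholder IDs are hypothetical, and the request shape should be checked against the Pull Request Threads REST reference):

POST https://dev.azure.com/{collection}/{project}/_apis/git/repositories/{repositoryId}/pullRequests/{pullRequestId}/threads?api-version=6.0
{
  "comments": [
    { "parentCommentId": 0, "content": "Automated review note", "commentType": 1 }
  ],
  "status": "active"
}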

Pull Request experience improvements


The new pull request experience has been improved with the following:
• Make the optional checks more visible
Customers use optional checks to draw a developer's attention to potential issues. In the previous
experience, it was obvious when these checks failed. However, that is not the case in the preview
experience: a big, green checkmark on the required checks masks the failures in optional checks. Users
could only discover that optional checks failed by opening the checks panel. Developers don't often do
that when there is no indication of a problem. In this deployment, we made the status of optional
checks more visible in the summary.
• Ctrl-clicks on menu items
Tab menus on a PR didn't support Ctrl-click. Users often open new browser tabs as they review a pull
request. This has been fixed.
• Location of [+] annotation
The tree listing of files in a PR shows an annotation [+] to help authors and reviewers identify new files.
Since the annotation was after the ellipsis, it was often not visible for longer file names.
• PR updates dropdown regains timing information
The dropdown used to select an update and compare files in a PR lost an important element in the preview
experience: it didn't show when that update was made. This has been fixed.
• Improved comment filter layout
When filtering comments on the summary page of a pull request, the drop-down was on the right, but
the text was left-aligned. This has been fixed.

Navigation to parent commits


Under the Commits page, you can compare the changes made in a particular commit with its parent
commit. However, you may want to navigate to the parent commit and further understand how that
commit differs from its own parent. This is often needed when you want to understand all the changes
in a release. We added a parent(s) card to a commit to help you achieve this.

More space for folders and files with long names in the PR files tab


Folders and files with long names were cut off due to a lack of horizontal spacing in the file tree. We
recovered some additional space in the tree by modifying the tree’s indentation to match the root node
and by hiding the ellipsis button from the page except on hover.
Image of the new file tree:
Image of the file tree when hovering over a directory:

Preserve scroll position when resizing diff pane in PR files tab


When resizing the side-by-side diff pane in the PR files tab, the user’s scroll location would be lost. This
issue has been fixed; the user’s scroll location is now retained on a diff pane resize.

Search for a commit on a mobile device


When viewing the Commits page on a mobile device, the search box is missing in the new experience. As
a result, it is hard for you to find a commit by its hash and open it. This has been fixed now.

Improved usage of space for new PR file diff mobile view


We updated this page to make better use of the space so that users can see more of the file in mobile
views instead of having 40% of the screen taken up by a header.

Enhanced images in PR summary view


Images edited in a PR were not showing in the PR summary view but did show correctly in the PR files
view. This issue has been resolved.

Enhanced branch experience when creating a new PR


Before this update, this experience was not ideal: it compared the changes with the default branch of the
repository instead of the compare (target) branch you selected. The diff now uses the compare branch.

Pipelines


Additional agent platform: ARM64


We added Linux/ARM64 to the list of supported platforms for the Azure Pipelines agent. Although the
code changes were minimal, a lot of behind-the-scenes work had to be completed first, and we're
excited to announce its release!
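Once you register a self-hosted Linux/ARM64 agent, a pipeline can target it through agent demands. A minimal sketch, assuming the agent was added to a pool named Default (the pool name is hypothetical):

pool:
  name: Default
  demands:
  - Agent.OSArchitecture -equals ARM64

steps:
- script: uname -m   # should print aarch64 on a Linux/ARM64 agent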

Tag filter support for pipeline resources


We have now added support for 'tags' on pipeline resources in YAML pipelines. You can use tags both to
determine which CI pipeline run is consumed by default and to control when the CD pipeline is automatically triggered.
resources:
  pipelines:
  - pipeline: MyCIAlias
    project: Fabrikam
    source: Farbrikam-CI
    branch: master
    tags:                    ### This filter is used for resolving default version
    - Production             ### Tags are AND'ed
    trigger:
      tags:                  ### This filter is used for triggering the pipeline run
      - Production           ### Tags are AND'ed
      - Signed
The above snippet shows that tags can be used to determine the default version of the CI (continuous
integration) pipeline to run when the CD (continuous deployment) pipeline run is not triggered by some
other source/resource or a scheduled run trigger.
For instance, if you have a scheduled trigger set for your CD pipeline that you only want to run if your CI
has the Production tag, the tags in the trigger section ensure that the CD pipeline is only triggered if
the tagging condition is met by the CI completion event.

Control which tasks are allowed in pipelines


You can now disable Marketplace tasks. Some of you may allow Marketplace extensions, but not the
Pipelines tasks they bring along. For even more control, we also allow you to independently disable all
in-the-box tasks (except checkout, which is a special action). With both of these settings enabled, the
only tasks allowed to run in a pipeline would be those uploaded using tfx.
Visit https://dev.azure.com/<your_org>/_settings/pipelinessettings and look for the section called "Task
restrictions" to get started.

Exclusive deployment lock policy


With this update, you can ensure that only a single run deploys to an environment at a time. By choosing
the "Exclusive lock" check on an environment, only one run will proceed. Subsequent runs which want
to deploy to that environment will be paused. Once the run with the exclusive lock completes, the latest
run will proceed. Any intermediate runs will be canceled.
[Image: In the Add check page, select Exclusive Lock to ensure that only a single run deploys to an environment at a time.]
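For context, here is a minimal sketch of a deployment job targeting an environment (the environment name production is hypothetical); the Exclusive lock check itself is configured on that environment in the UI as described above:

jobs:
- deployment: DeployWeb
  environment: production      # the environment that carries the Exclusive lock check
  strategy:
    runOnce:
      deploy:
        steps:
        - script: echo Deploying to production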

Stages filters for pipeline resource triggers


We added support for 'stages' as a filter for pipeline resources in YAML. With this filter, you don't need
to wait for the entire CI pipeline to be completed to trigger your CD pipeline. You can now choose to
trigger your CD pipeline upon completion of a specific stage in your CI pipeline.
resources:
  pipelines:
  - pipeline: MyCIAlias
    project: Fabrikam
    source: Farbrikam-CI
    trigger:
      stages:            ### This stage filter is used when evaluating conditions for triggering your CD pipeline
      - PreProduction    ### Stages are AND'ed. On successful completion of all the stages provided, your CD pipeline will be triggered.
      - Production
When the stages provided in the trigger filter are successfully completed in your CI pipeline, a new run is
automatically triggered for your CD pipeline.

Generic webhook based triggers for YAML pipelines


Today, we have various resources (such as pipelines, containers, build, and packages) through which you
can consume artifacts and enable automated triggers. Until now, however, you could not automate your
deployment process based on other external events or services. In this release, we are introducing
webhook trigger support in YAML pipelines to enable integration of pipeline automation with any
external service. You can subscribe to any external event through its webhooks (GitHub, GitHub
Enterprise, Nexus, Artifactory, etc.) and trigger your pipelines.
Here are the steps to configure the webhook triggers:
1. Set up a webhook on your external service. When creating your webhook, you need to provide
the following info:
o Request Url -
"https://dev.azure.com/<collection>/_apis/public/distributedtask/webhooks/<WebHook Name>?api-version=6.0-preview"
o Secret - This is optional. If you need to secure your JSON payload, provide
the Secret value.
2. Create a new "Incoming Webhook" service connection. This is a newly introduced service
connection type that lets you define three important pieces of information:
o Webhook Name - The name of the webhook should match the webhook created in your
external service.
o HTTP Header - The name of the HTTP header in the request that contains the payload
hash value for request verification. For example, in the case of GitHub, the request
header will be "X-Hub-Signature".
o Secret - The secret is used to verify the payload hash of the incoming request (this is
optional). If you used a secret when creating your webhook, you will need to provide
the same secret key.
[Image: In the Edit service connection page, configure webhook triggers.]
3. A new resource type called webhooks is introduced in YAML pipelines. For subscribing to a
webhook event, you need to define a webhook resource in your pipeline and point it to the
Incoming webhook service connection. You can also define additional filters on the webhook
resource based on the JSON payload data to further customize the triggers for each pipeline,
and you can consume the payload data in the form of variables in your jobs.
resources:
  webhooks:
  - webhook: MyWebhookTrigger          ### Webhook alias
    connection: MyWebhookConnection    ### Incoming webhook service connection
    filters:
    - path: repositoryName             ### JSON path in the payload
      value: maven-releases            ### Expected value in the path provided
    - path: action
      value: CREATED

steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    ### JSON payload data is available in the form of ${{ parameters.<WebhookAlias>.<JSONPath>}}
    script: |
      Write-Host ${{ parameters.MyWebhookTrigger.repositoryName}}
      Write-Host ${{ parameters.MyWebhookTrigger.component.group}}
4. Whenever a webhook event is received by the Incoming Webhook service connection, a new
run will be triggered for all the pipelines subscribed to the webhook event.

YAML resource trigger issues support and traceability


It can be confusing when pipeline triggers fail to execute as you expect them to. To help better
understand this, we've added a new menu item in the pipeline definition page called 'Trigger Issues'
where information is surfaced regarding why triggers are not executing.
Resource triggers can fail to execute for two reasons.
1. If the source of the service connection provided is invalid, or if there are any syntax errors in the
trigger, the trigger will not be configured at all. These are surfaced as errors.
2. If trigger conditions are not matched, the trigger will not execute. Whenever this occurs, a
warning will be surfaced so you can understand why the conditions were not matched.
[Image: The Trigger Issues page in the pipeline definition displays information about why triggers are not running.]

Banner for live site incidents impacting pipelines


We've added a warning banner to the pipelines page to alert users of ongoing incidents in your region,
which may impact your pipelines.

Pipelines images announcements


Note: We're constantly working to improve your experience using Azure Pipelines. To learn more
about upcoming updates to our Windows, Linux, and macOS images, please check here:
• Windows - https://github.com/actions/virtualenvironments/blob/main/images/win/announcements.md
• Linux - https://github.com/actions/virtualenvironments/blob/main/images/linux/announcements.md

Multi-repo triggers


You can specify multiple repositories in one YAML file and cause a pipeline to trigger by updates to any
of the repositories. This feature is useful, for instance, in the following scenarios:
• You consume a tool or a library from a different repository. You want to run tests for your
application whenever the tool or library is updated.
• You keep your YAML file in a separate repository from the application code. You want to trigger
the pipeline every time an update is pushed to the application repository.
With this update, multi-repo triggers will only work for Git repositories in Azure Repos. They don't work
for GitHub or BitBucket repository resources.
Here is an example that shows how to define multiple repository resources in a pipeline and how to
configure triggers on all of them.
trigger:
- main

resources:
  repositories:
  - repository: tools
    type: git
    name: MyProject/tools
    ref: main
    trigger:
      branches:
        include:
        - main
        - release
The pipeline in this example will be triggered if there are any updates to:
• main branch in the self repo containing the YAML file
• main or release branches in tools repo
For more information, see Multiple repositories in your pipeline.

Agent log uploads improved


When an agent stops communicating with the Azure Pipelines server, the job it was running becomes
abandoned. If you happened to be looking at the streaming console logs, you might have gotten some
clues about what the agent was up to right before it stopped responding. But if you weren't, or if you
refreshed the page, those console logs were gone. With this release, if the agent stops responding
before it sends up its full logs, we'll keep the console logs as the next-best thing. Console logs are limited
to 1000 characters per line and can occasionally be incomplete, but they're a lot more helpful than
showing nothing! Examining these logs may help you troubleshoot your own pipelines, and it will
certainly help our support engineers when they assist with troubleshooting.

Optionally mount container volumes read-only


When you run a container job in Azure Pipelines, several volumes containing the workspace, tasks, and
other materials are mapped as volumes. These volumes default to read/write access. For increased
security, you can mount the volumes read-only by altering your container specification in YAML. Each
key under mountReadOnly can be set to true for read-only (the default is false).
resources:
  containers:
  - container: example
    image: ubuntu:18.04
    mountReadOnly:
      externals: true
      tasks: true
      tools: true
      work: false

Fine-grained control over container start/stop


In general, we recommend that you allow Azure Pipelines to manage the lifecycle of your job and service
containers. However, in some uncommon scenarios, you may want to start and stop them yourself. With
this release, we've built that capability into the Docker task.
Here's an example pipeline using the new capability:
resources:
  containers:
  - container: builder
    image: ubuntu:18.04

steps:
- script: echo "I can run inside the container (it starts by default)"
  target:
    container: builder
- task: Docker@2
  inputs:
    command: stop
    container: builder
# if any step tried to run in the container here, it would fail
Also, we include the list of containers in a pipeline variable, Agent.ContainerMapping. You can use this if
you want to inspect the list of containers in a script, for example. It contains a stringified JSON object
mapping the resource name ("builder" from the example above) to the container ID the agent manages.
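A minimal sketch of inspecting that variable from a script step on a Linux/macOS (bash) agent; the output is the stringified JSON object described above:

steps:
- script: echo '$(Agent.ContainerMapping)'   # single quotes keep the expanded JSON intact for bash
  displayName: Inspect container IDs managed by the agent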

Unzip task bundles for each step


When the agent runs a job, it first downloads all the task bundles required by the job's steps. Normally,
for performance, it unzips the tasks once per job even if the task is used in multiple steps. If you have
concerns about untrusted code altering the unzipped contents, you can trade away a little bit of
performance by having the agent unzip the task on each usage. To enable this mode, pass
--alwaysextracttask when configuring the agent.

Improve release security by restricting scope of access tokens


Building upon our initiative to enhance security settings for Azure Pipelines, we now support restricting
scope of access tokens for releases.
Every job that runs in releases gets an access token. The access token is used by the tasks and by your
scripts to call back into Azure DevOps. For example, we use the access token to get source code,
download artifacts, upload logs, test results, or to make REST calls into Azure DevOps. A new access
token is generated for each job, and it expires once the job completes.
With this update, we build on that earlier work to improve pipeline security by restricting the scope of
access tokens, and extend the same capability to classic releases.
This feature will be on by default for new projects and collections. For existing collections, you must
enable it in Collection Settings > Pipelines > Settings > Limit job authorization scope to current project
for release pipelines. Learn more here.

YAML preview API enhancements


You can now preview the complete YAML of a pipeline without running it. In addition, we've created a
dedicated new URL for the preview capability. Now you can POST
to https://dev.azure.com/{collection}/{project}/_apis/pipelines/{pipelineId}/preview to retrieve the
finalized YAML body. This new API takes the same parameters as queuing a run, but no longer requires
the "Queue builds" permission.

Run this job next


Have you ever had a bugfix which you needed to deploy right this minute but had to wait behind CI and
PR jobs? With this release, we now allow you to bump the priority of a queued job. Users with the
"Manage" permission on the pool - typically pool administrators - will see a new "Run next" button on
the job details page. Clicking the button will set the job to be run as soon as possible. (You'll still need
available parallelism and a suitable agent, of course.)

Template expressions allowed in YAML resources block


Previously, compile-time expressions (${{ }}) were not allowed in the resources section of an Azure
Pipelines YAML file. With this release, we have lifted this restriction for containers. This allows you to
use runtime parameter contents inside your resources, for example to pick a container at queue time.
We plan to extend this support to other resources over time.
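A minimal sketch of what this enables, assuming a runtime parameter named imageTag (the parameter and image names are hypothetical):

parameters:
- name: imageTag
  type: string
  default: '18.04'

resources:
  containers:
  - container: builder
    image: ubuntu:${{ parameters.imageTag }}   # resolved from the parameter chosen at queue time

steps:
- script: echo "Running inside ubuntu:${{ parameters.imageTag }}"
  target:
    container: builder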

Control over automated task updates from Marketplace


When you write a YAML pipeline, normally you specify only the major version number of the included
tasks. This allows your pipelines to automatically take the latest feature additions and bug fixes.
Occasionally you may need to roll back to a previous point release of a task, and with this update, we
added the ability for you to do so. You may now specify a full major.minor.patch task version in your
YAML pipelines.
We don't recommend that you do this regularly, and use it only as a temporary workaround when you
find that a newer task breaks your pipelines. Also, this will not install an older version of a task from the
Marketplace. It must already exist in your collection or your pipeline will fail.
Example:
steps:
- task: MyTask@1 # a normal, major-version only reference
- task: MyOtherTask@2.3.4 # pinned to a precise version

Node 10 support in agent and tasks


Since Node 6 is no longer supported, we are migrating the tasks to work with Node 10. For this update,
we have migrated nearly 50% of in-box tasks to Node 10. The agent can now run both Node 6 and Node
10 tasks. In a future update, we will entirely remove Node 6 from the agent. To prepare for the removal
of Node 6 from the agent, we request that you update your in-house extensions and custom tasks to
also use Node 10 soon. To use Node 10 for your task, in your task.json, under execution, update
from Node to Node10.
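For example, the execution section of a custom task's task.json would change roughly as follows (a sketch; index.js stands in for your task's entry script):

{
  "execution": {
    "Node10": {
      "target": "index.js"
    }
  }
}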

Improve YAML conversion in the classic build designer


With this release, we introduce a new "export to YAML" feature for designer build pipelines. Save your
pipeline definition, then find "Export to YAML" on the ... menu.
The new export function replaces the "View as YAML" function previously found in the classic build
designer. That function was incomplete as it could only inspect what the web UI knew about the build,
which sometimes led to incorrect YAML being generated. The new export function takes into account
exactly how the pipeline will be processed and generates YAML with full fidelity to the designer pipeline.

New web platform conversion – Repository settings


We have converted the two Repository settings pages to a single experience that was upgraded to a
new web platform. This upgrade not only makes the experience faster and more modern, but these
pages also provide a single entry-point for all policies from the project level to the branch level.
With this new experience, navigation for projects with a substantial number of repositories has become
easier because of faster load times and an added search filter. You can also view project level policies
and the list of cross-repo policies under the Policies tab.
If you click into a repository, you can view policies and permissions set at the repository level. Within the
Policies tab, you can view a list of every branch that has a policy set on it, and click a branch to see its
policies without ever leaving the Repository settings page.
Now, when policies are inherited from a higher scope than what you are working with, we show you
where the policy was inherited from next to each individual policy. You can also navigate to the page
where the higher-level policy was set by clicking the scope name.
The policy page itself has also been upgraded to the new web platform with collapsible sections! To
improve the experience of looking for a particular Build Validation, Status Check, or Automatic Reviewer
policy, we have added search filters for each section.

ServiceNow change management integration with YAML pipelines


The Azure Pipelines app for ServiceNow helps you integrate Azure Pipelines and ServiceNow Change
Management. With this update, we extend Azure Pipelines' awareness of the change management
process managed in ServiceNow to YAML pipelines.
By configuring the "ServiceNow Change Management" check on a resource, you can now pause for the
change to be approved before deploying the build to that resource. You can automatically create a new
change for a stage or wait on an existing change request.
You can also add the UpdateServiceNowChangeRequest task in a server job to update the change
request with deployment status, notes, etc.

Artifacts


Ability to create collection-scoped feeds from the UI


We are bringing back the ability for customers to create and manage collection-scoped feeds through
the web UI for both on-prem and hosted services.
You can now create collection-scoped feeds via the UI by going to Artifacts -> Create Feed and choosing a type
of feed within Scope.
[Image: Create collection-scoped feeds by selecting Artifacts, then Create Feed, and selecting a type of feed within Scope.]
While we do recommend the usage of project-scoped feeds in alignment with the rest of Azure DevOps
offerings, you can again create, manage, and use collection-scoped feeds via UI and various REST APIs.
Please see our feeds documentation for more information.

Configure upstream sources for Universal Packages


Now you can configure your Azure Artifacts feeds to automatically download Universal Packages from
upstream sources on demand.
Previously, you could configure upstream sources on your feed for NuGet, Python, Maven, and npm
packages, but not for Universal Packages. This was due to a difference in the storage technology used
for Universal Packages, which supports much larger packages than the other package types.
You can now configure upstream sources for Universal Packages in the same way as for other package
types; open your feed settings, click Upstream sources -> Add upstream source -> and choose the
source type that is right for you. You will see Universal Packages (UPack) as a new option in the next
view (see image below). For more information, please see the upstream sources
configuration documentation.
Note that Universal Packages in upstream sources are only supported between feeds in the same
DevOps collection.

Update Package Version REST API now available for Maven packages


You can now use the Azure Artifacts "Update Package Version" REST API to update Maven package
versions. Previously, you could use the REST API to update package versions for NuGet, npm,
and Universal Packages, but not Maven packages.
To update Maven packages, use the HTTP PATCH command as follows.
PATCH https://pkgs.dev.azure.com/{collection}/{project?}/_apis/packaging/feeds/{feedId}/maven/groups/{groupId}/artifacts/{artifactId}/versions/{packageVersion}?api-version=5.1-preview.1
You can set the package version details in the request body; see the REST API reference for the full schema.
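As an illustration, a request body along these lines would promote the version to a feed view named Release (a hedged sketch that assumes Maven uses the same PackageVersionDetails shape as the other package types; verify against the REST API reference):

{
  "views": {
    "op": "add",
    "path": "/views/-",
    "value": "Release"
  }
}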


If needed, you can also consult the online documentation for Microsoft Azure DevOps Server for more authoritative details.

------------------------------------------------------------

http://www.cnblogs.com/danzhang/  DevOps MVP 张洪君

------------------------------------------------------------

